A beautiful web client for OpenClaw.
Install and run:

```
curl -fsSL https://opencami.xyz/install.sh | bash
# or via npm:
npm install -g opencami

opencami --gateway ws://127.0.0.1:18789 --token <GATEWAY_TOKEN>
```

Then open: http://localhost:3000
This is the safest setup: the gateway stays on loopback, and you access OpenCami via https://<magicdns>:<port>.
- In OpenClaw, allowlist the exact OpenCami URL (no trailing slash):

```json
{
  "gateway": {
    "controlUI": {
      "allowedOrigins": ["https://<magicdns>:3001"]
    }
  }
}
```

- Restart the gateway:

```
openclaw gateway restart
```

- Start OpenCami with the same origin:

```
opencami \
  --gateway ws://127.0.0.1:18789 \
  --token <GATEWAY_TOKEN> \
  --origin https://<magicdns>:3001
```
⚠️ Note: `--gateway` must be `ws://` or `wss://` (not `https://`).
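The scheme rule can be enforced before launch with a small shell guard (a sketch; `check_gateway_url` is a hypothetical helper, not part of the OpenCami CLI):

```shell
#!/usr/bin/env bash
# Accept only WebSocket schemes for the gateway URL.
check_gateway_url() {
  case "$1" in
    ws://*|wss://*) return 0 ;;
    *) echo "error: --gateway must be ws:// or wss://, got: $1" >&2; return 1 ;;
  esac
}

check_gateway_url "ws://127.0.0.1:18789" && echo "ok"
check_gateway_url "https://127.0.0.1:18789" || echo "rejected"
```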
```
opencami [--port <n>] [--host <addr>] [--gateway <ws(s)://...>] [--token <token>] [--password <pw>] [--origin <url>] [--no-open]

  --port <n>        Port to listen on (default: 3000)
  --host <addr>     Host to bind to (default: 127.0.0.1)
  --gateway <url>   OpenClaw gateway WS URL (default: ws://127.0.0.1:18789)
  --token <token>   Gateway token (sets CLAWDBOT_GATEWAY_TOKEN)
  --password <pw>   Gateway password (sets CLAWDBOT_GATEWAY_PASSWORD)
  --origin <url>    Origin header for backend WS (sets OPENCAMI_ORIGIN)
  --no-open         Don't open browser on start
  -h, --help        Show help
```
You can also set env vars instead of flags:

```
CLAWDBOT_GATEWAY_URL=ws://127.0.0.1:18789
CLAWDBOT_GATEWAY_TOKEN=...
OPENCAMI_ORIGIN=https://<magicdns>:3001   # only needed for remote HTTPS
```

- "origin not allowed" → add the exact URL to `gateway.controlUI.allowedOrigins` and pass the same value as `--origin`/`OPENCAMI_ORIGIN` (exact match, no trailing `/`).
- Pairing required → approve the device in OpenClaw (`openclaw devices list`/`approve`).
- Fallback (only if needed):

```
OPENCAMI_DEVICE_AUTH_FALLBACK=1
```
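The env-var approach can be sketched end to end (values are placeholders; the launch line is commented out since it assumes `opencami` is installed):

```shell
#!/usr/bin/env bash
# Configure the session entirely through environment variables.
export CLAWDBOT_GATEWAY_URL="ws://127.0.0.1:18789"
export CLAWDBOT_GATEWAY_TOKEN="<GATEWAY_TOKEN>"     # placeholder token
export OPENCAMI_ORIGIN="https://<magicdns>:3001"    # only for remote HTTPS

# With these set, no flags are needed:
# opencami --no-open
echo "configured gateway: $CLAWDBOT_GATEWAY_URL"
```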
- Prefer `wss://` for remote connections.
- Prefer token auth (`CLAWDBOT_GATEWAY_TOKEN`) over password auth.
- Keep `allowedOrigins` minimal (exact origins only, no wildcards).
- Treat `OPENCAMI_DEVICE_AUTH_FALLBACK=1` as a temporary compatibility mode.
- Do not expose OpenCami directly to the public internet without TLS + access controls.
- For Tailnet deployments, limit Tailnet device/user access.
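The "exact origins only" rule can be checked mechanically (a sketch; `valid_origin` is a hypothetical helper that mirrors the gateway's exact-match behavior, not its actual implementation):

```shell
#!/usr/bin/env bash
# An allowlisted origin must be a plain http(s) URL: no trailing slash, no wildcards.
valid_origin() {
  case "$1" in
    */)                 return 1 ;;  # trailing slash breaks exact match
    *\**)               return 1 ;;  # wildcards are not allowed
    http://*|https://*) return 0 ;;
    *)                  return 1 ;;
  esac
}

valid_origin "https://myhost.ts.net:3001"  && echo "accept"
valid_origin "https://myhost.ts.net:3001/" || echo "reject: trailing slash"
valid_origin "https://*.ts.net:3001"       || echo "reject: wildcard"
```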
Cause: gateway rejected browser origin.
Fix:
- Add the origin to `gateway.controlUI.allowedOrigins` (exact match, no trailing `/`)
- Set an identical `OPENCAMI_ORIGIN` (or `--origin`) in OpenCami
- Restart the gateway (`openclaw gateway restart`)
Cause: gateway auth succeeded but the device was paired with insufficient scopes.
Fix (v1.8.5+): delete the device identity and let OpenCami re-pair automatically:
```
rm ~/.opencami/identity/device.json
# then restart OpenCami – it will re-pair with full scopes
```

On first connect, OpenCami registers itself as a device on the gateway. Starting with v1.8.5, this happens automatically with full scopes (`operator.admin`, `operator.approvals`, `operator.pairing`) – no manual config required.
If you see a "device pending" error:
```
openclaw devices list                 # find the pending device
openclaw devices approve <deviceId>
```

After approval, OpenCami reconnects and stores a `deviceToken` for future sessions (no shared token needed).
Checks:
```
openclaw gateway status
echo "$CLAWDBOT_GATEWAY_URL"
echo "$CLAWDBOT_GATEWAY_TOKEN"
```

Also verify the URL scheme (`ws://` local, `wss://` remote).
```
docker build -t opencami .
docker run -p 3000:3000 opencami
```

- ⚡ Real-time streaming – persistent WebSocket + SSE, token-by-token
- 📎 File attachments – upload PDFs, text, code, CSV, JSON via attach button or drag & drop (`/uploads/` + `read` tool workflow)
- 📄 File cards – uploaded files render as clickable cards (filename, icon, size) and open in File Explorer
- 🖼️ Image attachments – drag & drop with compression (images stay Base64 for vision)
- 🔊 Voice playback (TTS) – ElevenLabs → OpenAI → Edge TTS fallback
- 🎤 Voice input (STT) – ElevenLabs Scribe → OpenAI Whisper → Browser
- 🔔 Browser notifications – background tab alerts when assistant replies
- 🏷️ Smart titles – LLM-generated session titles
- 💡 Smart follow-ups – contextual suggestions after each response
- 🧠 Thinking level toggle – reasoning depth (off/low/medium/high) per message
- 🔎 Search sources badge – see which search engines were used
- 📊 Context window meter – visual token usage indicator
- 📁 File explorer – browse & edit 30+ file types with built-in editor
- 🧠 Memory viewer – browse and edit MEMORY.md and daily memory files
- 🤖 Agent manager – create, edit, delete agents from the sidebar
- 🧩 Skills browser – discover and install skills from ClawHub
- ⏰ Cron jobs panel – manage scheduled automations
- 🔧 Workspace settings – toggle each tool on/off in Settings
- 🎨 Model selector – switch AI models per message
- 🎭 Persona picker – 20 AI personalities
- 🦎 Chameleon theme – light/dark/system with accent colors
- 🔤 Text size – S / M / L / XL
- 🔌 Multi-provider LLM – OpenAI, OpenRouter, Ollama, or custom
- 📂 Session folders – grouped by kind (chats, subagents, cron, other)
- 📌 Pin sessions – pinned always on top
- 🗑️ Bulk delete – select multiple sessions, delete at once
- 🛡️ Protected sessions – prevent accidental deletion
- 📥 Export – Markdown, JSON, or plain text
- 📱 PWA – installable, offline shell, auto-update
- 🖥️ Tauri desktop app (Beta) – native wrapper for macOS/Windows/Linux
- ⌨️ Keyboard shortcuts – full power-user navigation
- 💬 Slash commands – inline help and actions
- 🔍 Conversation search – current (⌘F) and global (⌘⇧F)
```
git clone https://github.com/robbyczgw-cla/opencami.git
cd opencami
npm install
cp .env.example .env.local
npm run dev
```

Then open the URL printed by Vite in your terminal.

Dev port notes: this repo's `npm run dev` script uses port `3002`. If you run Vite directly with the config default, it targets `3003` and auto-falls back to the next free port.
Note: The desktop app is experimental and under active development. The primary focus of OpenCami is the web app. Native builds (desktop & mobile) are secondary.
OpenCami can also run as a native macOS/Windows/Linux desktop wrapper built with Tauri v2. The app loads your self-hosted OpenCami web instance.
- Node.js 18+
- Rust toolchain (`rustup`)
```
# Install dependencies (if not already done)
npm install

# Build web assets first
npm run build

# Build desktop app
npm run tauri:build
```

By default, the desktop app connects to http://localhost:3003.

To override at build time:

```
OPENCAMI_REMOTE_URL="https://your-server.example.com" npm run tauri:build
```

Built installers/bundles are written to `src-tauri/target/release/bundle/`:
- macOS: `.app`, `.dmg`
- Windows: `.exe`, `.msi`
- Linux: `.deb`, `.AppImage`
- Tray icon (hide to tray on close)
- Native notifications
- Auto-start on login
- Custom titlebar
- Multiple windows (⌘N)
- Clipboard integration
```
npm run tauri:dev
```

Requires a display/GUI environment.
Built on top of WebClaw by @ibelick.
File Explorer by @balin-ar (PR #2).
Dockerfile by @deblanco (PR #7).
Powered by OpenClaw.
- 🌐 opencami.xyz
- 📦 npm
- 💻 GitHub
