When your agent earns, it deposits. When it calls the API, it pays from its own balance. You're out of the billing loop entirely.
If you are an AI agent, read SKILL.md for self-funding instructions.
The agent loop
Agent earns → agent deposits → agent pays inference → agent earns more. No human in the billing path.
POST /v1/register from anywhere — the agent's code, a setup script, you. Gets an API key and a deposit address. No human approval, no billing form.
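A minimal sketch of that registration call, assuming an illustrative base URL and an assumed response shape (`api_key`, `deposit_address`); the real schema comes from Level5's API docs:

```python
import json
import urllib.request

LEVEL5_API = "https://api.level5.example"  # illustrative, not the real endpoint


def register_payload(agent_name: str) -> dict:
    """Request body for POST /v1/register.

    No auth header, no billing fields -- the field name here is
    an assumption for illustration.
    """
    return {"name": agent_name}


def register(agent_name: str) -> dict:
    """POST /v1/register and return the parsed response.

    Assumed response shape: {"api_key": "...", "deposit_address": "..."}.
    """
    req = urllib.request.Request(
        f"{LEVEL5_API}/v1/register",
        data=json.dumps(register_payload(agent_name)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Anything that can issue an HTTP POST can run this: the agent itself, a setup script, or a one-off curl.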
The agent transfers from its earnings wallet to its Level5 balance. One confirmation and it's live. It can top up the same way whenever it earns more.
Set ANTHROPIC_BASE_URL or OPENAI_BASE_URL to Level5. Every inference call debits the agent's own balance automatically. You never see a bill.
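In practice the redirect is one line per provider. A sketch, with a placeholder gateway URL standing in for the real Level5 endpoint:

```python
import os

# Point existing SDK clients at Level5 (URL is illustrative).
# Both the Anthropic and OpenAI SDKs read these variables when a
# client is constructed, so the agent's code doesn't change.
os.environ["ANTHROPIC_BASE_URL"] = "https://api.level5.example"
os.environ["OPENAI_BASE_URL"] = "https://api.level5.example/v1"

# From here, every anthropic.Anthropic() or openai.OpenAI() call is
# routed through Level5 and debited from the agent's own balance.
```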
Why Level5
OpenRouter and Venice require a human to sign up and add a credit card. Your agent can't do that. Level5 can be operated entirely by the agent.
Agents can self-onboard via API. No billing form, no approval flow. Your agent registers, deposits from its earnings wallet, and starts calling. You can set it up for them, or they can do it themselves.
Change one environment variable. Point ANTHROPIC_BASE_URL or OPENAI_BASE_URL at Level5, and your existing key becomes a Level5 token. The agent's code doesn't change.
Every inference call is a line item. Model, input tokens, output tokens, exact cost. You can see precisely what each run cost — not just a monthly aggregate.
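Per-call line items make per-run accounting a one-liner. A sketch, where the record shape (`model`, token counts, `cost_usdc`) is an assumption for illustration, not Level5's documented schema:

```python
def run_cost(line_items: list[dict]) -> float:
    """Sum exact per-call costs into a per-run total (USDC)."""
    return round(sum(item["cost_usdc"] for item in line_items), 6)


# Two calls from one agent run (numbers are illustrative).
items = [
    {"model": "claude-sonnet", "input_tokens": 1200, "output_tokens": 300, "cost_usdc": 0.0081},
    {"model": "claude-sonnet", "input_tokens": 800, "output_tokens": 150, "cost_usdc": 0.0047},
]
total = run_cost(items)
```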
Prompt cache hits are billed at the discounted rate — same as what Anthropic charges. Your agent gets the savings, not us.
| Capability | Level5 | OpenRouter | Venice |
|---|---|---|---|
| Agents can self-onboard | ✓ | ✗ | ✗ |
| No human credit card | ✓ | ✗ | ✗ |
| Programmatic deposits | ✓ | ✗ | ✗ |
| Claude + GPT + Llama | ✓ | ✓ | ✓ |
| Per-call cost visibility | ✓ | ✓ | ✗ |
| Cache-aware billing | ✓ | ✓ | ✗ |
“When you give an agent a wallet, you give it the ability to care.”
Transparent pricing
Prices in USDC per 1M tokens.
Cache tokens billed at the discounted rate — you pay what the provider charges.
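To make the arithmetic concrete, a sketch of one call's cost with prices quoted per 1M tokens. The per-token prices below are placeholders, not Level5's price list; the default 10x cache-read discount mirrors Anthropic's published ratio but is a parameter, since it varies by provider:

```python
def call_cost_usdc(input_tokens: int, cached_tokens: int, output_tokens: int,
                   in_price: float, out_price: float,
                   cache_discount: float = 0.1) -> float:
    """Cost of one inference call in USDC; prices are per 1M tokens.

    Cache reads are billed at cache_discount * in_price -- the
    provider's discounted rate, passed straight through.
    """
    fresh = input_tokens - cached_tokens
    cost = (fresh * in_price
            + cached_tokens * in_price * cache_discount
            + output_tokens * out_price) / 1_000_000
    return round(cost, 6)


# 10k input tokens (8k served from cache), 2k output,
# at $3 / 1M input and $15 / 1M output (illustrative prices).
cost = call_cost_usdc(10_000, 8_000, 2_000, 3.0, 15.0)
```

With the cache discount, the 8k cached tokens cost a tenth of what fresh input would; the savings land on the agent's balance.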