Summary
The runtime model catalog (resolved from openclaw.json providers) is serialized into the LLM request payload as part of the system prompt context on every turn. Because environment variable references (${VAR}) are resolved to plaintext before serialization, all provider API keys are sent to whichever LLM provider handles the request.
This means every provider sees every other provider's keys on every single turn.
Important: This happens even when keys are NOT hardcoded in openclaw.json. Using the recommended ${ENV_VAR} syntax does not help — the variable references are resolved to plaintext at runtime, and the resolved values are what get serialized into the LLM context. Moving keys to environment variables, systemd EnvironmentFile=, or .env files does not mitigate this issue.
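For context, a configuration in the affected shape looks like this (a minimal sketch — the exact openclaw.json schema, provider names, and baseUrl field are illustrative assumptions; only the apiKey field and ${ENV_VAR} syntax come from this report):

```json
{
  "providers": {
    "openrouter": {
      "baseUrl": "https://openrouter.ai/api/v1",
      "apiKey": "${OPENROUTER_API_KEY}"
    },
    "google": {
      "apiKey": "${GEMINI_API_KEY}"
    }
  }
}
```

At runtime both ${…} references are expanded, and the expanded catalog — plaintext keys included — is what lands in the request payload.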
Reproduction
- Configure multiple providers in openclaw.json with apiKey: "${ENV_VAR}" syntax
- Enable request logging on any provider (e.g., OpenRouter, or use a local LLM proxy)
- Send a message to any agent
- Inspect the request payload — the full model catalog including resolved apiKey fields appears in the system context
Safe local reproduction: Point the main agent at a local LLM (e.g., Ollama) and inspect the incoming request payload. The resolved apiKey fields for all configured cloud providers will be visible in the system context. This confirms the bug without actually leaking keys to a third party.
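To make the inspection step concrete, the captured request body can be scanned mechanically. The sketch below (hypothetical payload shape — the `model_catalog` key and its nesting are assumptions for illustration; only the apiKey field name is from this report) walks a decoded JSON payload and reports every path where an `apiKey` value appears:

```python
import json

def find_api_keys(obj, path="$"):
    """Recursively collect (json_path, value) pairs for every 'apiKey' string field."""
    hits = []
    if isinstance(obj, dict):
        for k, v in obj.items():
            p = f"{path}.{k}"
            if k == "apiKey" and isinstance(v, str):
                hits.append((p, v))
            hits.extend(find_api_keys(v, p))
    elif isinstance(obj, list):
        for i, v in enumerate(obj):
            hits.extend(find_api_keys(v, f"{path}[{i}]"))
    return hits

# Example: scan a captured request body (shape is illustrative)
payload = json.loads("""{
  "messages": [{"role": "system", "content": "..."}],
  "model_catalog": {"providers": {"openrouter": {"apiKey": "sk-or-abc123"}}}
}""")

for p, value in find_api_keys(payload):
    print(f"LEAK: {p} = {value[:8]}...")
```

Any non-empty output against a real captured payload confirms the leak without sending traffic to a cloud provider.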
Impact
- Cross-provider key leakage: OpenRouter sees your NVIDIA key, Anthropic sees your Google key, etc.
- Leakage scales linearly: Keys are sent on every turn, not just once
- Affects all users with multiple providers configured
- Keys persist in provider logs — even after rotation, historical logs retain old keys
Expected Behavior
apiKey fields (and any other secret fields) should be stripped from the model catalog before it is serialized into LLM prompt context. The agent does not need provider credentials to function — it only needs model names, capabilities, and cost info.
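One possible fix shape is a recursive redaction pass over the catalog before serialization. This is a sketch, not the project's actual code — the SECRET_FIELDS list beyond apiKey is a hypothetical placeholder for whatever secret fields the schema defines:

```python
# Hypothetical deny-list; only "apiKey" is confirmed by this report.
SECRET_FIELDS = {"apiKey"}

def strip_secrets(node):
    """Return a deep copy of the model catalog with secret fields removed."""
    if isinstance(node, dict):
        return {k: strip_secrets(v) for k, v in node.items()
                if k not in SECRET_FIELDS}
    if isinstance(node, list):
        return [strip_secrets(v) for v in node]
    return node  # scalars pass through unchanged
```

Running the serialized catalog through a pass like this would leave model names, capabilities, and cost info intact while dropping credentials.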
Environment
- openclaw v2026.2.3-1 (confirmed still present in v2026.2.6-3)
- Gateway mode, systemd service
- Multiple providers (Google, OpenRouter, NVIDIA, X.AI)
Related Issues