Bug Description
OpenClaw strips the anthropic-beta: max-model-output-200k-2025-02-19 (context1m) header when the Anthropic auth profile uses OAuth token mode (mode: token). The log emits:
ignoring context1m for OAuth token auth on anthropic/claude-opus-4-6; Anthropic rejects context-1m beta with OAuth auth
This fires on every single request, regardless of model or context size.
Impact
- Users on the Anthropic Max plan who authenticate via OAuth tokens cannot use extended context (>200K), even though their plan supports it.
- During periods of Anthropic API load or enforcement, these requests may be rate-limited or rejected more aggressively because they lack the expected beta header.
- The warning fires continuously in logs (every few seconds during active sessions), creating log noise.
- Sessions that build up context over time hit 429 rate limits and become unresponsive, requiring manual session store deletion to recover.
Environment
- OpenClaw version: 2026.3.13 (61d171a) — also confirmed present in 2026.3.28
- OS: macOS (Darwin arm64)
- Node: v22.22.0
- Auth config:
{
  "auth": {
    "profiles": {
      "anthropic:manual": {
        "provider": "anthropic",
        "mode": "token"
      }
    }
  }
}
- Model: anthropic/claude-opus-4-6 (also fires for claude-sonnet-4-6)
- Plan: Anthropic Max (supports extended context)
Expected Behavior
The context1m beta header should be sent for OAuth token auth users, or at minimum there should be a config override (e.g., forceContext1m: true or anthropicBeta: ["max-model-output-200k-2025-02-19"]) that allows Max plan users to opt in.
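For illustration, an opt-in override might look like the following. Both option names (forceContext1m, anthropicBeta) are proposals from this report, not existing OpenClaw options, and the placement inside the auth profile is illustrative:

```json
{
  "auth": {
    "profiles": {
      "anthropic:manual": {
        "provider": "anthropic",
        "mode": "token",
        "forceContext1m": true
      }
    }
  }
}
```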
Actual Behavior
The header is unconditionally stripped for all OAuth token auth profiles. No config override exists.
Reproduction
- Configure the Anthropic provider with mode: token (OAuth)
- Start a session and send any message
- Observe the warning in the logs: ignoring context1m for OAuth token auth
- As the session grows in context, requests begin hitting 429 rate limits
- Eventually the session becomes unresponsive; it is only recoverable by deleting sessions.json and restarting
Suggested Fix
Either:
- Pass the context1m header through for OAuth token auth (Anthropic's API may now accept it for Max plan tokens)
- Add a provider-level config option to force the beta header regardless of auth mode
- Add an auth profile flag like context1m: true that overrides the stripping behavior
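A minimal sketch of the third option, assuming hypothetical names (AuthProfile, buildBetaHeaders, context1m) that stand in for OpenClaw's actual internals: the stripping would be gated on an opt-in flag instead of applied unconditionally for token-mode profiles.

```typescript
// Hypothetical sketch; these names are illustrative, not OpenClaw's real API.
interface AuthProfile {
  provider: string;
  mode: "token" | "api-key";
  context1m?: boolean; // proposed opt-in flag for Max plan users
}

const CONTEXT_1M_BETA = "max-model-output-200k-2025-02-19";

function buildBetaHeaders(profile: AuthProfile): string[] {
  const betas: string[] = [];
  // Current behavior strips the beta whenever mode === "token";
  // the proposed flag lets Max plan users opt back in.
  if (profile.mode !== "token" || profile.context1m) {
    betas.push(CONTEXT_1M_BETA);
  }
  return betas;
}
```

The default stays unchanged for token-mode profiles, so users who hit Anthropic-side rejections keep the current behavior unless they explicitly opt in.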
Workaround
Delete ~/.openclaw/agents/main/sessions/sessions.json and restart the gateway when sessions become unresponsive. This is not sustainable for production use.
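Until a fix lands, the recovery amounts to the following; the restart step depends on how you run the gateway, so it is left as a comment rather than a guessed command:

```shell
# Stop the gateway first, then remove the stuck session store.
rm -f ~/.openclaw/agents/main/sessions/sessions.json
# Restart the gateway; session state is recreated on startup.
```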