Closed
Labels: bug (Something isn't working), stale (Marked as stale due to inactivity)
Description
Bug Report: Model Configuration Inconsistency
Summary
Models can be added to the openclaw.json config but cannot be switched to via the session_status tool, due to an undocumented allowlist restriction.
Environment
- OpenClaw Version: 2026.2.3-1
- OS: Linux 6.8.0-94-generic (x64)
- Node: v22.22.0
Steps to Reproduce
1. Add a model to the config using `gateway config.patch`:

```json
{
  "agents": {
    "defaults": {
      "models": {
        "venice/venice-uncensored": {}
      }
    }
  }
}
```

2. Verify the model appears in the config:

```shell
grep -A 3 "venice-uncensored" ~/.openclaw/openclaw.json
```

Result: Model is present in config ✅

3. Attempt to switch to the model using `session_status`:

```
session_status(model="venice/venice-uncensored")
```
Expected Behavior
Option A (Permissive):
- If the model is in the config, `session_status` should accept it
- Model switch succeeds

Option B (Restrictive):
- If the model is not on the allowlist, `config.patch` should reject it with a clear error
- User knows immediately which models are supported
Actual Behavior
- Config accepts the model without error ❌
- Gateway restarts successfully ✅
- `session_status` rejects with: `Model "venice/venice-uncensored" is not allowed.` ❌
- No documentation about which models are approved ❌
- No way to discover the allowlist ❌
Approved Models (discovered through testing)
Only these two models work:
- `anthropic/claude-sonnet-4-5`
- `openrouter/google/gemini-2.5-flash`
All other attempts (Venice, Llama, other OpenRouter models) fail with "not allowed" error.
Config State vs Runtime State
Config says: Model is configured and available

```json
"models": {
  "anthropic/claude-sonnet-4-5": { "alias": "sonnet" },
  "openrouter/google/gemini-2.5-flash": { "alias": "flash" },
  "venice/venice-uncensored": {}
}
```

Runtime says: Model is not allowed

```
Error: Model "venice/venice-uncensored" is not allowed.
```
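The config-vs-runtime mismatch can be checked mechanically. A minimal sketch in Python (the allowlist below is hard-coded from the testing results in this report; where OpenClaw actually stores it is unknown, so `RUNTIME_ALLOWLIST` is an assumption):

```python
import json

# Allowlist as discovered through testing (assumption: the real list
# lives somewhere in OpenClaw's runtime, not in openclaw.json).
RUNTIME_ALLOWLIST = {
    "anthropic/claude-sonnet-4-5",
    "openrouter/google/gemini-2.5-flash",
}

def rejected_models(config: dict) -> list:
    """Return configured model IDs that the runtime will refuse to switch to."""
    configured = config.get("agents", {}).get("defaults", {}).get("models", {})
    return sorted(m for m in configured if m not in RUNTIME_ALLOWLIST)

# Sample config mirroring the state above.
config = json.loads("""
{
  "agents": {
    "defaults": {
      "models": {
        "anthropic/claude-sonnet-4-5": { "alias": "sonnet" },
        "openrouter/google/gemini-2.5-flash": { "alias": "flash" },
        "venice/venice-uncensored": {}
      }
    }
  }
}
""")

print(rejected_models(config))  # -> ['venice/venice-uncensored']
```

Running this against the config above flags exactly the model that `session_status` rejects at runtime.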
Impact
User Experience:
- Confusing: "I added it to config, why doesn't it work?"
- Time-wasting: Multiple restart cycles trying to diagnose
- False promise: OpenClaw appears model-agnostic but isn't in practice
Functional:
- Cannot access other models even with valid API keys
- Limits legitimate use cases (testing, research, different capabilities)
- Undermines "multi-model" value proposition
Suggested Fixes
Short-term:
- Document the allowlist clearly in config schema
- Add validation to config.patch that rejects unsupported models with helpful error
- Provide a command to list approved models, e.g. `openclaw models list --allowed`
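The proposed `config.patch` validation could look like the following sketch (Python; `ALLOWED_MODELS`, `ConfigError`, and the error wording are illustrative assumptions, not OpenClaw's actual API):

```python
# Hypothetical allowlist, taken from the two models observed to work.
ALLOWED_MODELS = {
    "anthropic/claude-sonnet-4-5",
    "openrouter/google/gemini-2.5-flash",
}

class ConfigError(ValueError):
    """Raised when a config patch references an unsupported model."""

def validate_models_patch(models: dict) -> None:
    """Reject a models patch at config time, naming the approved models."""
    unsupported = sorted(m for m in models if m not in ALLOWED_MODELS)
    if unsupported:
        raise ConfigError(
            "Model(s) not allowed: " + ", ".join(unsupported)
            + ". Approved models: " + ", ".join(sorted(ALLOWED_MODELS))
        )

# Accepted: patch contains only approved models.
validate_models_patch({"anthropic/claude-sonnet-4-5": {"alias": "sonnet"}})

# Rejected: the error immediately tells the user which models are approved,
# instead of failing later at session_status time.
try:
    validate_models_patch({"venice/venice-uncensored": {}})
except ConfigError as e:
    print(e)
```

Failing fast here would turn the current silent config success into the clear error requested in Option B above.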
Long-term:
- Remove allowlist restriction (trust user's config choices)
- Or make allowlist configurable via security policy setting
- Or provide clear pathway to add models to allowlist
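For the security-policy option, the allowlist could be surfaced in openclaw.json itself. A hypothetical schema (the `security.modelAllowlist` keys below are illustrative only and do not exist in OpenClaw's current config):

```json
{
  "security": {
    "modelAllowlist": {
      "enabled": true,
      "models": [
        "anthropic/claude-sonnet-4-5",
        "openrouter/google/gemini-2.5-flash"
      ]
    }
  }
}
```

Making the list visible and editable in one place would address both the discoverability and the configurability complaints above.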
Additional Context
Tested models that failed:
- `meta-llama/llama-3.1-405b-instruct`
- `meta-llama/llama-3.3-70b-instruct`
- `venice/venice-uncensored`
- `venice/hermes-3-llama-3.1-405b`
- `venice/llama-3.3-70b`
All were added to config successfully but rejected at runtime.
Reproducibility
Consistent across multiple attempts, multiple models, multiple config patches.
Related Files:
- Config: `~/.openclaw/openclaw.json`
- Models definition: `~/.openclaw/agents/main/agent/models.json`
- Tool: `session_status` (built-in)
Expected vs Actual Matrix:
| Action | Expected | Actual | Result |
|---|---|---|---|
| Add unsupported model to config | Error or warning | Success | ❌ Misleading |
| Gateway restart after config change | Success | Success | ✅ Works |
| Switch to configured model | Success | Error: "not allowed" | ❌ Inconsistent |
Environment Details:
```
🦞 OpenClaw 2026.2.3-1 (d84eb46)
Runtime: agent=main | host=openclaw-01 | os=Linux 6.8.0-94-generic (x64) | node=v22.22.0
Default model: anthropic/claude-sonnet-4-5
Configured providers: anthropic (token), openrouter (token), venice (token)
```