GLM-5 Model Switching Fails for Main Agent
Summary
GLM-5 (zai/glm-5) works perfectly when configured as a secondary agent's model in agents.list, but switching the main agent to GLM-5 fails via both the /model command and config.patch.
Environment
- OpenClaw version: 2026.2.9 (33c75cb)
- Platform: macOS (Darwin 25.3.0 arm64)
- Provider: ZAI (api.z.ai)
- Model: GLM-5
What Works ✅
Stilgar agent (configured in agents.list with zai/glm-5):
- Responds normally
- Shows correct model in session history:
  "api": "openai-completions", "provider": "zai", "model": "glm-5"
- Handles 35K+ token contexts without issue
Direct API test (curl to ZAI API):
curl -X POST https://api.z.ai/api/coding/paas/v4/chat/completions \
-H "Authorization: Bearer $ZAI_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "glm-5",
"messages": [{"role": "user", "content": "Hello"}],
"stream": true
}'

✅ Works perfectly, returns reasoning_content in the stream
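For repeated checks, parsing the stream can be scripted instead of eyeballing curl output. A minimal Python sketch that extracts the delta (including reasoning_content) from one OpenAI-style SSE line; the sample payload below is illustrative, not a captured ZAI response:

```python
import json

def extract_delta(sse_line: str):
    """Parse one SSE line from an OpenAI-compatible stream and return
    the delta object, or None for non-data or [DONE] lines."""
    if not sse_line.startswith("data: "):
        return None
    payload = sse_line[len("data: "):].strip()
    if payload == "[DONE]":
        return None
    chunk = json.loads(payload)
    return chunk["choices"][0]["delta"]

# Illustrative chunk shaped like the stream described above.
sample = 'data: {"choices":[{"delta":{"reasoning_content":"thinking..."}}]}'
print(extract_delta(sample))  # {'reasoning_content': 'thinking...'}
```

Feeding each line of the curl response through extract_delta makes it easy to confirm that reasoning_content actually arrives.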
What Fails ❌
Method 1: /model Command
User: /model glm
System: Model switched to glm (zai/glm-5).
User: hello
[No response]
Logs:
2026-02-13T18:55:09.350Z No reply from agent.
Session status after switch:
🧠 Model: zai/glm-5
🧮 Tokens: 5 in / 412 out
Only 5 tokens sent, which suggests a malformed or empty request.
Method 2: config.patch Agent Model
Changed agents.list main agent model from anthropic/claude-opus-4-6 to zai/glm-5:
{
"agents": {
"list": [
{"id":"main","model":"zai/glm-5"},
{"id":"leto","workspace":"...","model":"openrouter/minimax/minimax-m2.5"},
{"id":"stilgar","workspace":"...","model":"zai/glm-5"}
]
}
}

Result: Config saves, gateway restarts (SIGUSR1), but session_status still shows:
🧠 Model: anthropic/claude-sonnet-4-5
Even after full openclaw gateway stop && openclaw gateway start, session stays on Sonnet.
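To isolate whether the failure is in the config write or in the session binding, the saved config can be checked directly. A sketch assuming the config is plain JSON; the file path is a guess, not a documented OpenClaw location:

```python
import json

def main_agent_model(config: dict):
    """Return the model configured for the agent with id 'main' in agents.list."""
    for agent in config.get("agents", {}).get("list", []):
        if agent.get("id") == "main":
            return agent.get("model")
    return None

# In practice, load the saved config file first (path is hypothetical):
#   config = json.load(open("/path/to/openclaw/config.json"))
config = {"agents": {"list": [{"id": "main", "model": "zai/glm-5"}]}}
print(main_agent_model(config))  # zai/glm-5 -> the patch reached the file
```

If the on-disk config shows zai/glm-5 while session_status still reports Sonnet, the bug is in session model binding rather than in config.patch itself.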
Configuration
ZAI Provider Config
{
"models": {
"providers": {
"zai": {
"baseUrl": "https://api.z.ai/api/coding/paas/v4",
"apiKey": "...",
"api": "openai-completions",
"models": [
{
"id": "glm-5",
"name": "GLM 5",
"reasoning": true,
"input": ["text"],
"contextWindow": 128000,
"maxTokens": 16384
}
]
}
}
}
}

Model Alias
{
"agents": {
"defaults": {
"models": {
"zai/glm-5": {
"alias": "glm"
}
}
}
}
}

Hypothesis
- /model command accepts the switch but fails to initialize the ZAI provider correctly for the main session
- config.patch updates the config file but SIGUSR1 hot reload doesn't pick up agent model changes
- Something about the main session preserves its original model binding across restarts
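For reference, the lookup that /model glm presumably performs over agents.defaults.models can be sketched as follows; this is hypothetical logic, not OpenClaw's actual code:

```python
def resolve_alias(defaults_models: dict, alias: str):
    """Map a short alias (e.g. 'glm') back to its fully qualified model id."""
    for model_id, settings in defaults_models.items():
        if settings.get("alias") == alias:
            return model_id
    return None

models_cfg = {"zai/glm-5": {"alias": "glm"}}
print(resolve_alias(models_cfg, "glm"))  # zai/glm-5
```

Since the /model output correctly echoes "glm (zai/glm-5)", alias resolution itself appears to work; the failure is more likely in provider initialization afterward.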
Tested Parameters
Verified ZAI API accepts all these parameters (none cause 400):
- ✅ tools
- ✅ stream_options
- ✅ reasoning_effort
- ✅ prediction
- ✅ Anthropic cache control markers
Expected Behavior
Switching the main agent to zai/glm-5 via either /model glm or config.patch should work the same way it does for Stilgar's agent config.
Workaround
None found. Only way to use GLM-5 is to create a separate agent with the model pre-configured in agents.list.
Additional Context
- Stilgar's agent has been running on zai/glm-5 successfully for days
- Same ZAI API key, same provider config
- Main agent defaults to openrouter/auto in agents.defaults.model.primary, which might be fallback-related
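The symptoms above are consistent with the live session's original model binding shadowing everything else. A purely speculative sketch of such a resolution order (none of these names come from OpenClaw's code):

```python
def effective_model(session: dict, agent: dict, defaults: dict):
    """Hypothetical resolution order: a model bound to the live session wins
    over the agent's configured model, which wins over the default primary.
    If the session keeps a stale binding, config changes never surface."""
    return (session.get("model")
            or agent.get("model")
            or defaults.get("model", {}).get("primary"))

session = {"model": "anthropic/claude-sonnet-4-5"}   # stale session binding
agent = {"id": "main", "model": "zai/glm-5"}         # patched agent config
defaults = {"model": {"primary": "openrouter/auto"}}

print(effective_model(session, agent, defaults))  # anthropic/claude-sonnet-4-5
print(effective_model({}, agent, defaults))       # zai/glm-5
```

Under this model, a fresh session (empty session binding) would pick up zai/glm-5 correctly, which matches Stilgar's behavior and the main agent's failure.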