Description
Summary
Context window not detected correctly from Ollama API - OpenClaw detects 4096 tokens instead of actual value (32768), preventing agent from responding
Steps to reproduce
1. Configure OpenClaw with a Custom Provider pointing to local Ollama:
   - Run: openclaw configure
   - Provider: Custom Provider
   - API Base URL: http://127.0.0.1:11434/v1
   - Endpoint compatibility: OpenAI-compatible
   - Model ID: qwen2.5:32b (or qwen14b-32k)
2. Start the gateway: systemctl --user start openclaw-gateway
3. Launch the TUI and send a message:
   - openclaw tui
   - Type: "bonjour"
4. Observe the error message
Expected behavior
OpenClaw should detect the correct context window from Ollama's API response (32768 tokens for qwen2.5:32b) and allow the agent to respond normally.
Verification that Ollama exposes the correct value:
curl -s http://127.0.0.1:11434/api/show -d '{"name":"qwen2.5:32b"}' | jq '.model_info."qwen2.context_length"'
Output: 32768
Actual behavior
OpenClaw detects a 4096-token context window and blocks the agent.
The TUI shows tokens ?/4.1k instead of tokens ?/32k.
Logs show:
warn agent/embedded low context window: custom-127-0-0-1-11434/qwen2.5:32b ctx=4096 (warn<32000) source=modelsConfig
error agent/embedded blocked model (context window too small): custom-127-0-0-1-11434/qwen2.5:32b ctx=4096 (min=16000) source=modelsConfig
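The thresholds visible in those log lines (warn<32000, min=16000) imply gating logic along these lines. This is a reconstruction from the logs, not OpenClaw's actual code; names are hypothetical:

```python
WARN_THRESHOLD = 32_000  # from "warn<32000" in the warn log line
MIN_THRESHOLD = 16_000   # from "min=16000" in the error log line


def gate_model(ctx: int) -> str:
    """Classify a model's context window the way the logs imply."""
    if ctx < MIN_THRESHOLD:
        return "blocked"  # "blocked model (context window too small)"
    if ctx < WARN_THRESHOLD:
        return "warn"     # "low context window"
    return "ok"


print(gate_model(4096))   # blocked (what happens today with the misdetected value)
print(gate_model(32768))  # ok (what should happen with the real value)
```

So the bug is entirely in the detected ctx value: with the correct 32768 the model would pass both thresholds.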
Additional issue: Attempting to manually configure contextWindow fails:
openclaw config set agents.defaults.models.custom-127-0-0-1-11434/qwen2.5:32b.contextWindow 32768
Error: "Config validation failed: Unrecognized key: contextWindow"
Workaround: Using Qwen via Alibaba Cloud OAuth (qwen-portal provider) works correctly.
OpenClaw version
2026.2.21-2 (35a57bc)
Operating system
Ubuntu 24.04.4 LTS
Node.js: v22.22.0
Ollama: Latest version
Models tested: qwen2.5:32b, qwen14b-32k
Install method
No response
Logs, screenshots, and evidence
Impact and severity
No response
Additional information
No response