Bug type
Behavior bug (incorrect output/state without crash)
Beta release blocker
No
Summary
Hi, I’m seeing a local Ollama issue on macOS.
Direct Ollama API calls to http://127.0.0.1:11434 work normally and return quickly, but OpenClaw web chat keeps spinning and logs show ollama timeouts in the embedded agent path.
I tested on OpenClaw 2026.4.2 and 2026.4.1, cleared sessions/fallbacks, and reset local state. I can provide logs if helpful.
Steps to reproduce
Environment
macOS
Ollama running locally at http://127.0.0.1:11434
Tested with:
OpenClaw 2026.4.2
OpenClaw 2026.4.1
Default model set to ollama/llama3.2:latest
Fallbacks cleared.
Problem
In the OpenClaw web UI, sending a message causes the chat to keep spinning indefinitely. Sometimes only the Assistant timestamp appears, with no message body.
This also affects the agent path, not just the browser UI.
Important finding
Direct calls to the local Ollama API work correctly and quickly. This request returned "2" in about 0.755s, so Ollama itself appears healthy:
time curl -s http://127.0.0.1:11434/api/generate \
  -d '{"model":"llama3.2:latest","prompt":"What is 1+1? Answer with a single number only.","stream":false}'
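For completeness, the direct-API check can be wrapped with explicit timeouts so a hang fails fast instead of spinning. This is only a sketch: the `ollama_check` name is mine, and nothing beyond Ollama's documented `/api/tags` and `/api/generate` endpoints is assumed.

```shell
# Sketch: bounded direct-API health check against a local Ollama server.
ollama_check() {
  base="${1:-http://127.0.0.1:11434}"
  # Fail fast instead of hanging: every request is bounded with --max-time.
  curl -sf --max-time 5 "$base/api/tags" >/dev/null \
    || { echo "unreachable: $base"; return 1; }
  # -w appends the total request time after the JSON body.
  curl -sf --max-time 60 "$base/api/generate" \
    -d '{"model":"llama3.2:latest","prompt":"What is 1+1? Answer with a single number only.","stream":false}' \
    -w '\ntotal: %{time_total}s\n'
}
```

Pointed at a port with nothing listening, it reports "unreachable" within seconds rather than spinning, which makes the contrast with the hanging web chat easy to demonstrate.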
What I already tried
Installation / environment
Reinstalled Homebrew
Reinstalled OpenClaw
Downgraded from 2026.4.2 to 2026.4.1
Model / config
Cleared all model fallbacks
Confirmed default model is ollama/llama3.2:latest
Repeatedly cleaned OLLAMA_API_KEY / ollama-local related config from:
shell config
~/.openclaw/openclaw.json
~/.openclaw/agents/main/agent/models.json
~/.openclaw/agents/main/agent/auth-profiles.json
Re-ran openclaw models auth login --provider ollama
Selected:
base URL: http://127.0.0.1:11434
mode: Local
Session / state cleanup
Cleared active sessions.json to []
Disabled old session .jsonl files that referenced mistralai/mistral-7b-instruct
Cleared / modified BOOTSTRAP.md and HEARTBEAT.md
Temporarily moved aside:
~/.openclaw/workspace
~/.openclaw/agents/main/agent
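The "move aside" steps above can be sketched as a small script. The directory layout is taken from the paths listed in this report; the `openclaw_reset` name is mine, and the state root is parameterized so the script can be tried against a scratch copy first. Nothing is deleted, only moved.

```shell
# Sketch: move OpenClaw local state aside into a timestamped backup directory.
openclaw_reset() {
  OC_HOME="${1:-$HOME/.openclaw}"
  bak="$OC_HOME.bak-$(date +%Y%m%d%H%M%S)"
  mkdir -p "$bak"
  for p in workspace agents/main/agent; do
    [ -e "$OC_HOME/$p" ] || continue
    mkdir -p "$bak/$(dirname "$p")"
    mv "$OC_HOME/$p" "$bak/$p"   # move aside, never delete
  done
  echo "moved state to $bak"
}
```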
Even after this full reset, the issue persisted.
Logs / symptoms
1. OpenClaw times out when calling local Ollama
Recent logs show:
Profile ollama:default timed out
Requests end in a timeout even with fallbacks disabled.
2. OpenClaw hot-reloads Ollama API key config
Logs also show:
config change detected ... (models.providers.ollama.apiKey)
config hot reload applied (models.providers.ollama.apiKey)
3. Old Mistral route was present before cleanup
Old session state referenced:
openrouter/mistralai/mistral-7b-instruct
assistant messages with empty content and 404 No endpoints found...
After cleanup, active sessions.json became [], and those references remained only in disabled / backup files.
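A quick way to audit for such leftovers before cleanup is to grep the state directory for the stale model string. A sketch, with the directory parameterized; the `find_stale` name is mine.

```shell
# Sketch: list every state file that still references the stale Mistral route.
find_stale() {
  dir="${1:-$HOME/.openclaw}"
  grep -rl "mistralai/mistral-7b-instruct" "$dir" 2>/dev/null \
    || echo "no stale references in $dir"
}
```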
Current conclusion
This does not look like an Ollama issue:
direct Ollama API works
local model responds quickly
issue persists inside OpenClaw’s local web chat / embedded agent path.
It appears to be an OpenClaw local-agent/Ollama integration bug on this setup.
Questions
Is there a known issue in OpenClaw 2026.4.1 / 2026.4.2 with local Ollama timing out in web chat or embedded agent mode?
Why is models.providers.ollama.apiKey being hot-reloaded during runtime?
Why does OpenClaw time out against local Ollama even though direct Ollama API calls succeed immediately?
Expected behavior
A message sent in the OpenClaw web chat receives a reply generated by the local Ollama model.
Actual behavior
The chat spins indefinitely with no reply; at most an empty Assistant timestamp appears, and logs show Ollama timeouts.
OpenClaw version
2026.4.2 (also reproduced on 2026.4.1)
Operating system
macOS 15.4
Install method
Homebrew
Model
ollama/llama3.2:latest
Provider / routing chain
OpenClaw → local Ollama (http://127.0.0.1:11434), fallbacks cleared
Additional provider/model setup details
No response
Logs, screenshots, and evidence
Impact and severity
No response
Additional information
No response