Version
openclaw 2026.4.20 (115f05d) (also reproduces against latest 2026.5.4 per CHANGELOG inspection)
- Windows 11, Node.js (via nvm4w), PowerShell 7
Summary
On every CLI invocation — including non-interactive --json reads — the CLI emits
Ollama could not be reached at http://127.0.0.1:11434.
to stderr, regardless of whether Ollama is in agents.defaults.model.primary, agents.defaults.model.fallbacks, the model catalog, or any auth profile. There is no ollama/* model anywhere in my configuration.
This corrupts stderr capture in any programmatic consumer (frontends, CI scripts, GitHub Actions runners, daemon supervisors) and forces every consumer to filter the noise.
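As a hypothetical illustration of the workaround every consumer currently has to write (the function name is mine, not OpenClaw's; only the literal warning string comes from the report above):

```typescript
// Hypothetical filter a JSON consumer must apply to OpenClaw's stderr
// before treating it as real diagnostics. The pattern matches the warning
// quoted above (including PowerShell's "node.exe : " prefix, since we test
// per line as a substring); all other lines pass through untouched.
const OLLAMA_NOISE = /Ollama could not be reached at http:\/\/127\.0\.0\.1:11434\.?/;

function stripOllamaNoise(stderr: string): string {
  return stderr
    .split("\n")
    .filter((line) => !OLLAMA_NOISE.test(line))
    .join("\n");
}
```

Every downstream tool ends up carrying some variant of this, which is the maintenance burden this issue is about.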
Reproduction
With no Ollama configuration anywhere in ~/.openclaw/openclaw.json:
> openclaw agents list --json 2>&1 | Out-String
[
{ "id": "main", "model": "openai/gpt-5.4-mini", ... },
...
]
node.exe : Ollama could not be reached at http://127.0.0.1:11434.
> openclaw sessions --json 2>&1 | Out-String
{ "path": "...", "count": 69, "sessions": [...] }
node.exe : Ollama could not be reached at http://127.0.0.1:11434.
> openclaw models list --json 2>&1 | Out-String
node.exe : Ollama could not be reached at http://127.0.0.1:11434.
{ "count": 7, "models": [...] }
models list --json returns 7 models, none of them Ollama: openai/gpt-5.4-mini, openai/gpt-4o, vllm/nvidia/Qwen3-30B-A3B-NVFP4, openai/gpt-5.4, openai/gpt-4o-mini, openai/o4-mini, nvidia/Qwen3-30B-A3B-NVFP4.
Expected behavior
The Ollama provider probe should run only when Ollama is referenced by the active agent's model chain or auth profiles. If a probe must run for catalog refresh, its unreachable-host message should be at debug level, not warn/stderr.
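A minimal sketch of the gating described above, assuming the resolved configuration exposes the model chain as provider-prefixed IDs (the interface and function names here are illustrative, not OpenClaw internals):

```typescript
// Illustrative sketch: only probe Ollama when some resolved model id or
// auth profile actually references the ollama provider. Shape of the
// resolved config is assumed, not taken from OpenClaw's source.
interface ResolvedConfig {
  modelChain: string[];   // e.g. ["openai/gpt-5.4-mini", "openai/gpt-4o"]
  authProfiles: string[]; // provider names with credentials configured
}

function shouldProbeOllama(cfg: ResolvedConfig): boolean {
  return (
    cfg.modelChain.some((id) => id.startsWith("ollama/")) ||
    cfg.authProfiles.includes("ollama")
  );
}
```

With a config like the one in the reproduction (all openai/vllm/nvidia models, no ollama auth profile), this predicate is false and no probe, and therefore no warning, would run.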
Impact
We ship a desktop frontend (Crystal: https://github.com/jvpflum/Crystal) on top of OpenClaw and currently filter this string in two places to avoid leaking the warning into user-facing chat output and into gateway-log views. Every JSON consumer needs the same workaround. A literal-string search for could not be reached at http://127.0.0.1:11434 will hit other downstream tools too.
Suggested fixes (any one is sufficient)
- Demote unconfigured probes to debug. If no ollama/* reference is found in the resolved model chain, suppress the warning entirely.
- Honor OPENCLAW_LOG_LEVEL. Currently the warning prints regardless of level.
- Probe lazily. Only probe Ollama on the first dispatch that targets it, not on every CLI startup.
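For the second option, a sketch of the level check the warning path could apply (OPENCLAW_LOG_LEVEL is the env var named above; the level ordering and function shape are hypothetical, since I haven't traced OpenClaw's logger):

```typescript
// Hypothetical: honor the configured log level before emitting the
// unreachable-host message. With the probe message demoted to "debug"
// and a default level of "warn", nothing reaches stderr.
const LEVELS = ["debug", "info", "warn", "error"] as const;
type Level = (typeof LEVELS)[number];

function shouldLog(msgLevel: Level, env: Record<string, string | undefined>): boolean {
  const configured = (env.OPENCLAW_LOG_LEVEL ?? "warn") as Level;
  return LEVELS.indexOf(msgLevel) >= LEVELS.indexOf(configured);
}
```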
Happy to send a PR for option 1 if there's appetite.