fix: Auto-discover Ollama models without requiring explicit API key #4782

spiceoogway wants to merge 1 commit into openclaw:main
Conversation
Force-pushed a893a43 to 0ea97fc
Force-pushed 0ea97fc to 8ecc1c6
Fixes openclaw#4544

When Ollama is running locally with models installed, they were showing as "missing" in `openclaw models list` because the provider was only registered if an explicit API key was configured. Since Ollama is a local service that doesn't require authentication, this change:

- Auto-discovers Ollama models by attempting to query the local instance
- Registers the provider if models are found
- Uses a placeholder API key `ollama-local` for local instances
- Still respects explicit configuration if provided

This aligns with how other local/unauthenticated providers work (e.g., qwen-portal with OAuth placeholder). Users can now use Ollama models immediately after installing Ollama and pulling models, without needing to configure authentication.
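The discovery step described above boils down to querying Ollama's `GET /api/tags` endpoint and mapping the response to model names. A minimal sketch of that idea — the function names and types here are illustrative, not the PR's actual code:

```typescript
// Subset of the fields in Ollama's GET /api/tags response.
interface OllamaTagsResponse {
  models?: Array<{ name: string }>;
}

// Extract model names from a /api/tags payload, tolerating missing fields.
function parseOllamaTags(payload: OllamaTagsResponse): string[] {
  return (payload.models ?? []).map((m) => m.name);
}

// Probe a local Ollama instance; return [] if it is unreachable or unhealthy,
// since "not running" is an expected state for implicit discovery.
async function discoverOllamaModels(
  baseUrl = "http://127.0.0.1:11434",
): Promise<string[]> {
  try {
    const response = await fetch(`${baseUrl}/api/tags`);
    if (!response.ok) return [];
    return parseOllamaTags((await response.json()) as OllamaTagsResponse);
  } catch {
    return []; // connection refused etc.: treat as "no models", not an error
  }
}
```

Registering the provider only when the returned list is non-empty gives exactly the "auto-discover and register if running" behavior the commit describes.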
Force-pushed 8ecc1c6 to b888c82
```ts
// Ollama provider - auto-discover if running locally
const ollamaProvider = await buildOllamaProvider();
if (ollamaProvider.models.length > 0) {
  const ollamaKey =
    resolveEnvApiKeyVarName("ollama") ??
    resolveApiKeyFromProfiles({ provider: "ollama", store: authStore }) ??
    "ollama-local"; // Placeholder for local Ollama instance
  providers.ollama = { ...ollamaProvider, apiKey: ollamaKey };
}
```
[P1] Ollama discovery now runs even when explicitly configured; may break “explicit config disables auto-discovery” contract
resolveImplicitProviders() will always call buildOllamaProvider() (which hits http://127.0.0.1:11434/api/tags) before checking for any explicit OLLAMA_API_KEY/auth profile. If a user has explicitly configured Ollama for a remote host or expects discovery to be opt-in, this adds an unexpected localhost probe and can cause the implicit provider to appear/disappear based purely on local state. Consider gating discovery behind “no explicit provider entry” (or behind an explicit opt-in flag) to preserve the documented behavior.
Also appears inconsistent with docs in docs/providers/ollama.md that say implicit discovery is opt-in via OLLAMA_API_KEY and skipped when models.providers.ollama is explicitly set.
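One way to restore the documented contract is to gate the localhost probe on a pure predicate checked before `buildOllamaProvider()` is called. The helper name and input shape below are illustrative, not the repository's actual API:

```typescript
// Inputs that count as "explicit" Ollama configuration (illustrative shape).
interface OllamaGateInputs {
  explicitProviderEntry: boolean; // models.providers.ollama is set in config
  envApiKey?: string;             // OLLAMA_API_KEY from the environment
  profileApiKey?: string;         // key resolved from an auth profile
}

// Per docs/providers/ollama.md: discovery is skipped when the provider is
// explicitly configured, and is otherwise opt-in via an Ollama API key.
function shouldProbeLocalOllama(inputs: OllamaGateInputs): boolean {
  // Explicit (possibly remote) config always wins: never auto-probe localhost.
  if (inputs.explicitProviderEntry) return false;
  // Otherwise only probe when the user opted in by providing a key.
  return Boolean(inputs.envApiKey ?? inputs.profileApiKey);
}
```

With this gate in place, the implicit provider can no longer appear or disappear based purely on local state for users who never opted in.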
Path: `src/agents/models-config.providers.ts`, lines 456-463
Additional Comments (1)
Path: `src/agents/models-config.providers.ts`, lines 100-110
[P2] **Noisy warnings during normal “not running / no models” case**
`discoverOllamaModels()` logs warnings for expected situations (`!response.ok`, empty `models`). Since `resolveImplicitProviders()` now probes Ollama by default, these warnings can show up for users who don’t use Ollama (or haven’t pulled models yet) and make `openclaw models list` look unhealthy. Consider downgrading these to debug-level logging or only warning when the user explicitly enabled Ollama discovery.
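The suggested fix amounts to routing expected failures to a quieter log level when discovery is implicit. A sketch of that routing — the `quiet` parameter and logger shape are illustrative, not the PR's exact code:

```typescript
type LogLevel = "debug" | "warn";

// During implicit auto-discovery (quiet = true), "Ollama not running" and
// "no models pulled yet" are expected states, so report them at debug level;
// only an explicit, user-requested discovery should warn loudly.
function discoveryLogLevel(quiet: boolean): LogLevel {
  return quiet ? "debug" : "warn";
}

// Route a discovery issue to the appropriate logger method.
function reportDiscoveryIssue(
  log: Record<LogLevel, (msg: string) => void>,
  quiet: boolean,
  msg: string,
): void {
  log[discoveryLogLevel(quiet)](msg);
}
```

`resolveImplicitProviders()` would pass `quiet: true` down to `discoverOllamaModels()`, keeping `openclaw models list` free of warnings for users who don't run Ollama.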
Force-pushed bfc1ccb to f92900f
…4544)

Ollama is a local service that doesn't require authentication, but the provider was only registered when OLLAMA_API_KEY was set. This caused locally installed models to show as "missing" in `openclaw models list`.

Changes:
- Always attempt Ollama model discovery via /api/tags
- Register provider with placeholder key "ollama-local" when models found
- Add quiet flag to suppress warnings during implicit auto-discovery, so users without Ollama won't see noisy console output
- Preserve existing behavior when API key is explicitly configured

Fixes openclaw#4544
Ref openclaw#4782
Closed after AI-assisted stale-fix triage; see prior comment for details.
Summary
Fixes #4544
When Ollama is running locally with models installed, they were showing as "missing" in `openclaw models list` because the provider was only registered if an explicit API key was configured.

Root Cause

In `src/agents/models-config.providers.ts`, the `resolveImplicitProviders()` function only added the Ollama provider when an API key was explicitly configured via environment variables or auth profiles. Since Ollama is a local service that doesn't require traditional API authentication (it runs on localhost:11434), this prevented auto-discovery of locally installed models.
Changes

- If the local Ollama instance responds with models (`models.length > 0`), the provider is registered automatically
- Uses the placeholder API key `'ollama-local'` for local instances

Testing
Expected Behavior After Fix
1. `ollama pull deepseek-r1:latest`
2. `openclaw models list`
3. `openclaw models set ollama/deepseek-r1:latest`

Impact
Note: This change makes OpenClaw's Ollama integration work the same way as other local model servers - auto-discover and register if running.
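Put together, the registration behavior described in this PR reduces to a key-fallback chain plus a conditional assignment. A self-contained sketch under that reading — the types and helper names are illustrative, not the repository's actual code:

```typescript
interface ProviderEntry {
  models: string[];
  apiKey: string;
}

// Choose the API key: explicit env key, then auth-profile key, then the
// placeholder used for unauthenticated local instances.
function resolveOllamaKey(envKey?: string, profileKey?: string): string {
  return envKey ?? profileKey ?? "ollama-local";
}

// Register the provider only when discovery found at least one model,
// so machines without Ollama get no phantom provider entry.
function registerOllama(
  providers: Record<string, ProviderEntry>,
  discoveredModels: string[],
  envKey?: string,
  profileKey?: string,
): void {
  if (discoveredModels.length === 0) return;
  providers.ollama = {
    models: discoveredModels,
    apiKey: resolveOllamaKey(envKey, profileKey),
  };
}
```

Because the explicit key sits first in the fallback chain, configured users see no behavior change, while unconfigured users with a running Ollama instance get the placeholder key.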
Greptile Overview
Greptile Summary
This PR changes implicit provider resolution to auto-register Ollama when a local Ollama instance responds with at least one installed model. It does so by always building the Ollama provider (which triggers `/api/tags` discovery against `http://127.0.0.1:11434`) and, if any models are returned, registering the provider with either an explicit env/auth-profile key or a placeholder `ollama-local` key.

This fits into the existing `resolveImplicitProviders()` pattern, which opportunistically adds providers based on available credentials/profiles and discovery results. The main behavior change is that Ollama discovery is no longer strictly gated by explicit configuration; it can now be activated by local state alone.

Confidence Score: 4/5
Context used:
- `dashboard` - CLAUDE.md (source)
- `dashboard` - AGENTS.md (source)