
fix: Auto-discover Ollama models without requiring explicit API key#4782

Closed
spiceoogway wants to merge 1 commit into openclaw:main from spiceoogway:fix/issue-4544-ollama-auto-discovery

Conversation

@spiceoogway

@spiceoogway spiceoogway commented Jan 30, 2026

Summary

Fixes #4544

When Ollama is running locally with models installed, those models showed as 'missing' in 'openclaw models list' because the provider was only registered when an explicit API key was configured.

Root Cause

In src/agents/models-config.providers.ts, the resolveImplicitProviders() function only added the Ollama provider when an API key was explicitly configured via environment variables or auth profiles.

Since Ollama is a local service that doesn't require traditional API authentication (it runs on localhost:11434), this prevented auto-discovery of locally installed models.
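The pre-fix gating described above can be sketched roughly as follows. The helper name, signature, and env lookup are simplified stand-ins for the real code in models-config.providers.ts, not the project's actual API:

```typescript
// Hypothetical sketch of the pre-fix behavior: the Ollama provider is only
// registered when an explicit API key resolves, so locally installed models
// stay hidden from 'openclaw models list'.
type OllamaProvider = { baseUrl: string; apiKey: string };

function resolveImplicitOllama(
  env: Record<string, string | undefined>,
): OllamaProvider | undefined {
  const apiKey = env["OLLAMA_API_KEY"]; // stand-in for env/auth-profile lookup
  if (!apiKey) {
    // Pre-fix: no key means the provider is never added, even though the
    // local service at 127.0.0.1:11434 needs no authentication.
    return undefined;
  }
  return { baseUrl: "http://127.0.0.1:11434", apiKey };
}
```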

Changes

  • Modified the Ollama provider registration logic to auto-discover models by attempting to query the local instance
  • If models are found (models.length > 0), the provider is registered automatically
  • Uses a placeholder API key 'ollama-local' for local instances
  • Still respects explicit configuration if provided (backward compatible)

Testing

  • ✅ Linted the changed file: no warnings or errors
  • ✅ Aligns with patterns used for other local/unauthenticated providers (e.g., qwen-portal with OAuth placeholder)

Expected Behavior After Fix

  1. User runs ollama pull deepseek-r1:latest
  2. User runs openclaw models list
  3. Ollama models are now listed as available (not missing)
  4. User can set default model: openclaw models set ollama/deepseek-r1:latest
  5. Chat can successfully use the Ollama model

Impact

  • Low risk: Only affects Ollama provider discovery
  • High value: Enables out-of-the-box Ollama support without manual configuration
  • Backward compatible: Existing configurations with explicit API keys still work

Note: This change makes OpenClaw's Ollama integration work the same way as other local model servers: auto-discover and register if the server is running.

Greptile Overview

Greptile Summary

This PR changes implicit provider resolution to auto-register Ollama when a local Ollama instance responds with at least one installed model. It does so by always building the Ollama provider (which triggers /api/tags discovery against http://127.0.0.1:11434) and, if any models are returned, registering the provider with either an explicit env/auth-profile key or a placeholder ollama-local key.

This fits into the existing resolveImplicitProviders() pattern, which opportunistically adds providers based on available credentials/profiles and discovery results. The main behavior change is that Ollama discovery is no longer strictly gated by explicit configuration; it can now be activated by local state alone.
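The discovery step the summary refers to can be sketched like this. The /api/tags endpoint and response shape follow Ollama's public HTTP API; the function name and error handling are illustrative assumptions, not the project's actual helper:

```typescript
// Hypothetical sketch of querying a local Ollama instance for installed
// models via GET /api/tags. Returns an empty list when Ollama is not
// running or has no models, so the caller can skip registration.
async function discoverOllamaModels(
  baseUrl = "http://127.0.0.1:11434",
): Promise<string[]> {
  try {
    const response = await fetch(`${baseUrl}/api/tags`);
    if (!response.ok) return [];
    const body = (await response.json()) as { models?: { name: string }[] };
    return (body.models ?? []).map((m) => m.name);
  } catch {
    // Connection refused: Ollama is not running; treat as "no models".
    return [];
  }
}
```

With a helper like this, the registration site only needs a `models.length > 0` check to decide whether to add the provider.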

Confidence Score: 4/5

  • This PR is likely safe to merge, but it changes Ollama discovery semantics and may introduce noisy localhost probing/logging in common cases.
  • The code change is small and localized, but it alters when discovery runs (potentially affecting users who rely on opt-in behavior or explicit config) and can emit warnings during normal “Ollama not running/no models” scenarios.
  • src/agents/models-config.providers.ts

Context used:

  • Context from dashboard - CLAUDE.md (source)
  • Context from dashboard - AGENTS.md (source)

@openclaw-barnacle openclaw-barnacle bot added channel: nostr Channel integration: nostr agents Agent runtime and tooling labels Jan 30, 2026
@spiceoogway force-pushed the fix/issue-4544-ollama-auto-discovery branch 2 times, most recently from a893a43 to 0ea97fc on January 30, 2026 at 17:25
@openclaw-barnacle openclaw-barnacle bot removed the channel: nostr Channel integration: nostr label Jan 30, 2026
@spiceoogway force-pushed the fix/issue-4544-ollama-auto-discovery branch from 0ea97fc to 8ecc1c6 on January 30, 2026 at 17:26
Fixes openclaw#4544

When Ollama is running locally with models installed, they were showing
as 'missing' in 'openclaw models list' because the provider was only
registered if an explicit API key was configured.

Since Ollama is a local service that doesn't require authentication,
this change:
- Auto-discovers Ollama models by attempting to query the local instance
- Registers the provider if models are found
- Uses a placeholder API key 'ollama-local' for local instances
- Still respects explicit configuration if provided

This aligns with how other local/unauthenticated providers work
(e.g., qwen-portal with OAuth placeholder).

Users can now use Ollama models immediately after installing Ollama
and pulling models, without needing to configure authentication.

@greptile-apps greptile-apps bot left a comment

1 file reviewed, 2 comments


Comment on lines +456 to +463
// Ollama provider - auto-discover if running locally
const ollamaProvider = await buildOllamaProvider();
if (ollamaProvider.models.length > 0) {
  const ollamaKey =
    resolveEnvApiKeyVarName("ollama") ??
    resolveApiKeyFromProfiles({ provider: "ollama", store: authStore }) ??
    "ollama-local"; // Placeholder for local Ollama instance
  providers.ollama = { ...ollamaProvider, apiKey: ollamaKey };


[P1] Ollama discovery now runs even when explicitly configured; may break “explicit config disables auto-discovery” contract

resolveImplicitProviders() will always call buildOllamaProvider() (which hits http://127.0.0.1:11434/api/tags) before checking for any explicit OLLAMA_API_KEY/auth profile. If a user has explicitly configured Ollama for a remote host or expects discovery to be opt-in, this adds an unexpected localhost probe and can cause the implicit provider to appear/disappear based purely on local state. Consider gating discovery behind “no explicit provider entry” (or behind an explicit opt-in flag) to preserve the documented behavior.

Also appears inconsistent with docs in docs/providers/ollama.md that say implicit discovery is opt-in via OLLAMA_API_KEY and skipped when models.providers.ollama is explicitly set.
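One way to implement the gating the reviewer suggests could look like this sketch; the option names are hypothetical, not the project's actual config keys:

```typescript
// Hypothetical sketch: skip the localhost probe when the user has an
// explicit provider entry, and keep discovery opt-in via an explicit key,
// matching the documented behavior in docs/providers/ollama.md.
function shouldProbeOllama(opts: {
  hasExplicitProviderEntry: boolean; // models.providers.ollama is set
  hasExplicitApiKey: boolean; // OLLAMA_API_KEY or an auth profile is present
}): boolean {
  if (opts.hasExplicitProviderEntry) return false; // explicit config wins
  return opts.hasExplicitApiKey; // discovery stays opt-in
}
```

Under this gating, the auto-discovery probe never fires for users with a remote Ollama host configured, and only fires for users who opted in.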


@greptile-apps

greptile-apps bot commented Feb 3, 2026

Additional Comments (1)

src/agents/models-config.providers.ts (lines 100-110)
[P2] Noisy warnings during normal “not running / no models” case

discoverOllamaModels() logs warnings for expected situations (!response.ok, empty models). Since resolveImplicitProviders() now probes Ollama by default, these warnings can show up for users who don’t use Ollama (or haven’t pulled models yet) and make openclaw models list look unhealthy. Consider downgrading these to debug-level logging or only warning when the user explicitly enabled Ollama discovery.

Note: If this suggestion doesn't match your team's coding style, reply to this and let me know. I'll remember it for next time!
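The downgrade this comment proposes could be sketched as a small helper; the names are illustrative, not the project's logging API:

```typescript
// Hypothetical sketch: "not running" / "no models" are expected outcomes
// of an implicit probe, so only surface them as warnings when the user
// explicitly enabled Ollama discovery.
type LogLevel = "debug" | "warn";

function discoveryLogLevel(explicitlyEnabled: boolean): LogLevel {
  return explicitlyEnabled ? "warn" : "debug";
}
```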


marcodelpin added a commit to marcodelpin/openclaw that referenced this pull request Feb 19, 2026
…4544)

Ollama is a local service that doesn't require authentication, but the
provider was only registered when OLLAMA_API_KEY was set. This caused
locally installed models to show as "missing" in openclaw models list.

Changes:
- Always attempt Ollama model discovery via /api/tags
- Register provider with placeholder key "ollama-local" when models found
- Add quiet flag to suppress warnings during implicit auto-discovery,
  so users without Ollama won't see noisy console output
- Preserve existing behavior when API key is explicitly configured

Fixes openclaw#4544
Ref openclaw#4782
@steipete

AI-assisted stale-fix triage closure.


@steipete

Closed after AI-assisted stale-fix triage; see prior comment for details.

@steipete steipete closed this Feb 24, 2026
marcodelpin added a commit to marcodelpin/openclaw that referenced this pull request Feb 26, 2026

Labels

agents Agent runtime and tooling


Development

Successfully merging this pull request may close these issues.

[Bug]: Cannot change model to Ollama Deepseek-r1:latest

2 participants