feat: add Ollama model provider support #575
Conversation
No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID:
📒 Files selected for processing (2)
🚧 Files skipped from review as they are similar to previous changes (2)
📝 Walkthrough

Adds Ollama as a BYOK provider across API schemas, backend services (upsert/verify), OpenClaw config/runtime resolution, frontend UI/types/i18n/branding, and tests; includes provider-specific handling (dummy key, model discovery) and a model-refresh flow.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
  participant Web as Web UI
  participant Controller as Controller API
  participant Service as ModelProviderService
  participant Ollama as Local Ollama
  participant Config as NexuConfigStore
  Web->>Controller: POST /api/v1/providers/ollama/verify { baseUrl, apiKey? }
  Controller->>Service: verifyProvider("ollama", input)
  Service->>Ollama: GET {baseUrl}/api/tags (Authorization if provided)
  Ollama-->>Service: 200 [{"name":"modelA"}, {"name":"modelB"}]
  Service-->>Controller: { valid: true, models: ["modelA","modelB"] }
  Controller-->>Web: 200 { valid: true, models: [...] }
  alt User saves provider
    Web->>Controller: PUT /api/v1/providers/ollama { baseUrl, apiKey? }
    Controller->>Service: upsertProvider(...)
    Service->>Config: persist provider (apiKey -> "ollama-local" if missing)
    Config-->>Service: saved
    Service-->>Controller: 200 OK
    Controller-->>Web: 200 OK
  end
```
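The verify leg of the diagram reduces to a single GET plus a small response mapping. A minimal TypeScript sketch, assuming the simplified payload shape shown in the diagram (a bare array of `{ name }` objects; real Ollama wraps the list in a `models` field) and using illustrative type and function names:

```typescript
// Types follow the simplified payload shown in the diagram above.
interface OllamaTag {
  name: string;
}

interface VerifyResult {
  valid: boolean;
  models?: string[];
  error?: string;
}

// Map an /api/tags response into the controller's verify result.
function toVerifyResult(status: number, tags: OllamaTag[]): VerifyResult {
  if (status !== 200) {
    return { valid: false, error: `Ollama responded with HTTP ${status}` };
  }
  return { valid: true, models: tags.map((t) => t.name) };
}
```

The save leg then persists the provider, substituting the `"ollama-local"` dummy key when no key was supplied.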
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs
Suggested reviewers
🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (4 passed)
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: f1606325ed
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
Actionable comments posted: 1
🧹 Nitpick comments (2)
apps/controller/src/services/model-provider-service.ts (1)
`498-523`: **Consider supporting auth headers for secured Ollama instances.**

The verification request doesn't include any authentication headers, but `input.apiKey` is available. If a user configures a secured Ollama instance (e.g., behind a reverse proxy with API key auth), the verification will fail even with valid credentials. Consider adding optional auth support:

💡 Suggested enhancement

```diff
 if (providerId === "ollama") {
+  const headers: Record<string, string> = {};
+  if (input.apiKey && input.apiKey !== OLLAMA_DUMMY_API_KEY) {
+    headers.Authorization = `Bearer ${input.apiKey}`;
+  }
   const response = await fetch(
     buildProviderUrl(
       input.baseUrl ?? PROVIDER_BASE_URLS[providerId] ?? null,
       "/api/tags",
     ) ?? verifyUrl,
     {
+      headers: Object.keys(headers).length > 0 ? headers : undefined,
       signal: AbortSignal.timeout(10000),
     },
   );
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@apps/controller/src/services/model-provider-service.ts` around lines 498-523: the Ollama verification branch (providerId === "ollama" inside the validate/check routine) currently calls fetch(buildProviderUrl(...) ?? verifyUrl, ...) without any auth headers; update the request to include authentication when input.apiKey is present; add an Authorization (e.g., Bearer) or X-API-Key header as appropriate to the fetch options so secured Ollama instances behind API-key auth can be validated; keep using buildProviderUrl and preserve the AbortSignal.timeout behavior and error handling.

apps/web/src/pages/models.tsx (1)

`1352-1354`: **Consider extracting the placeholder constant.**

The hardcoded `"ollama-local"` string is used as a dummy API key for Ollama (which doesn't require authentication). This value propagates to the backend and database. Consider extracting it to a named constant for clarity and to avoid magic strings.

💡 Suggested refactor

```diff
+const OLLAMA_PLACEHOLDER_KEY = "ollama-local";
+
 function ByokProviderDetail({
   // ...
 }) {
   // ...
   const isOllama = providerId === "ollama";
   const hostBridge = getModelsHostInvokeBridge();
-  const effectiveApiKey = isOllama ? "ollama-local" : apiKey;
+  const effectiveApiKey = isOllama ? OLLAMA_PLACEHOLDER_KEY : apiKey;
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@apps/web/src/pages/models.tsx` around lines 1352-1354: replace the magic string "ollama-local" used when computing effectiveApiKey with a named constant (e.g., OLLAMA_DUMMY_API_KEY or OLLAMA_LOCAL_API_KEY); update the code that sets effectiveApiKey (the isOllama check and assignment) to use that constant and move the constant to an appropriate scope/file (same module or shared constants) so it's clear this is a deliberate dummy key and can be reused wherever isOllama or effectiveApiKey logic is referenced.
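Taken together, the two nitpicks suggest a single shared helper in which the dummy-key constant doubles as the sentinel for "no real auth". A minimal sketch under that assumption (the constant and function names are illustrative, not the PR's actual code):

```typescript
// Illustrative name; the PR may use a different identifier.
const OLLAMA_DUMMY_API_KEY = "ollama-local";

// Build optional auth headers for an Ollama verification request.
// Returns undefined when no real key is configured, so local,
// unauthenticated Ollama instances keep working unchanged.
function buildOllamaAuthHeaders(
  apiKey: string | undefined,
): Record<string, string> | undefined {
  if (!apiKey || apiKey === OLLAMA_DUMMY_API_KEY) {
    return undefined;
  }
  return { Authorization: `Bearer ${apiKey}` };
}
```

Both the frontend's `effectiveApiKey` computation and the backend's verify fetch could then reference the same constant.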
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/controller/src/lib/openclaw-config-compiler.ts`:
- Around line 70-71: In the switch mapping that handles provider types (the
branch matching case "ollama" in openclaw-config-compiler.ts), change the
returned API type from "ollama" to "openai-completions" so Ollama is treated as
the OpenAI-compatible completions API; update the return value in the case
"ollama" branch to "openai-completions".
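The fix described above amounts to one switch arm in the compiler's provider-to-API mapping. A hedged sketch (the union type and function name are illustrative; only the `"ollama"` to `"openai-completions"` mapping comes from the comment itself):

```typescript
// Illustrative union; the real compiler likely supports more dialects.
type OpenClawApiType = "openai-completions" | "anthropic" | "unknown";

// Map a BYOK provider id to the API dialect the compiler emits.
// Per the review, Ollama is driven through its OpenAI-compatible
// completions endpoint rather than a dedicated "ollama" API type.
function resolveApiType(providerId: string): OpenClawApiType {
  switch (providerId) {
    case "ollama":
      return "openai-completions";
    case "anthropic":
      return "anthropic";
    default:
      return "unknown";
  }
}
```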
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 193f302a-bc7a-41d8-b4d3-d9478325e2ba
📒 Files selected for processing (13)
- apps/controller/openapi.json
- apps/controller/src/lib/byok-providers.ts
- apps/controller/src/lib/openclaw-config-compiler.ts
- apps/controller/src/routes/desktop-compat-routes.ts
- apps/controller/src/services/model-provider-service.ts
- apps/web/lib/api/types.gen.ts
- apps/web/src/components/provider-logo.tsx
- apps/web/src/i18n/locales/en.ts
- apps/web/src/i18n/locales/zh-CN.ts
- apps/web/src/pages/models.tsx
- tests/desktop/model-provider-service.test.ts
- tests/desktop/openclaw-config-compiler.test.ts
- tests/web/models-selection.test.ts
💡 Codex Review
Reviewed commit: 703b319e62
Actionable comments posted: 1
🧹 Nitpick comments (4)
apps/controller/src/runtime/openclaw-process.ts (1)
`22-38`: **Consider extracting `findWorkspaceRoot()` to a shared utility.**

This function is duplicated verbatim in `apps/controller/src/services/model-provider-service.ts` (lines 181-194). Both files use identical logic to discover the monorepo root for openclaw entry path resolution. Extracting this to a shared module (e.g., `lib/workspace.ts` or `lib/paths.ts`) would reduce duplication and ensure both call sites stay in sync.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@apps/controller/src/runtime/openclaw-process.ts` around lines 22-38: extract the duplicated findWorkspaceRoot logic into a shared utility module (e.g., create lib/workspace.ts or lib/paths.ts) that exports a single function named findWorkspaceRoot(startDir: string): string | null; replace the inline implementations in openclaw-process.ts and model-provider-service.ts with imports from that new module and call the exported findWorkspaceRoot to preserve behavior; ensure the new module imports path and existsSync and re-exports the function signature unchanged so both call sites (findWorkspaceRoot in openclaw-process.ts and the duplicate in model-provider-service.ts) require no other changes.

tests/desktop/openclaw-config-compiler.test.ts (1)
`292`: **Misleading test name: "native ollama API" vs "openai-completions".**

The test name suggests Ollama is compiled with its "native" API, but the assertion expects `api: "openai-completions"` (line 316). While Ollama does expose an OpenAI-compatible endpoint, calling it "native" may confuse future maintainers. Consider a clearer name:

```diff
- it("compiles ollama providers with the native ollama API", () => {
+ it("compiles ollama providers using the openai-completions API", () => {
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@tests/desktop/openclaw-config-compiler.test.ts` at line 292: the test name is misleading; change the "it" description that currently reads 'compiles ollama providers with the native ollama API' to clearly indicate the OpenAI-compatible endpoint (e.g., 'compiles ollama providers with the OpenAI-compatible API') or alternatively update the expectation to match true native behavior; locate the test starting with it("compiles ollama providers with the native ollama API", () => { and adjust the string to reference "openai-completions" (or make the assertion match the native Ollama API) so the test name and the assertion api: "openai-completions" are consistent.

apps/web/src/pages/models.tsx (2)
`2177-2196`: **Consider extracting the verification result display.**

This JSX block (lines 2177-2196) duplicates the verification feedback from lines 2120-2139. Consider extracting to a small component or helper to reduce duplication.

💡 Example extraction

```tsx
function VerifyResultFeedback({
  result,
  t,
}: {
  result: { valid: boolean; models?: string[]; error?: string } | undefined;
  t: (key: string, options?: Record<string, unknown>) => string;
}) {
  if (!result) return null;
  return (
    <div
      className={cn(
        "mt-1.5 text-[10px]",
        result.valid ? "text-emerald-600" : "text-red-500",
      )}
    >
      {result.valid
        ? t("models.byok.keyValid", { count: result.models?.length ?? 0 })
        : t("models.byok.keyInvalid", {
            error: result.error ?? t("models.byok.keyInvalidUnknown"),
          })}
    </div>
  );
}
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@apps/web/src/pages/models.tsx` around lines 2177-2196: extract the duplicated JSX that renders verification feedback into a small reusable component (e.g., VerifyResultFeedback) and use it in both places where the block currently appears; the component should accept the verify result (verifyMutation.data or similar) and the translation function t, return null if no result, apply the same cn styling logic ("mt-1.5 text-[10px]" + result.valid ? "text-emerald-600" : "text-red-500"), and render the same translation calls (t("models.byok.keyValid", { count: result.models?.length ?? 0 }) or t("models.byok.keyInvalid", { error: result.error ?? t("models.byok.keyInvalidUnknown") })) so both instances (the JSX using verifyMutation and the other duplicated block) simply render <VerifyResultFeedback result={verifyMutation.data} t={t} />.
`1543-1573`: **Consider adding error feedback for `refreshModelsMutation`.**

The mutation throws on verification failure but lacks `onError`/`onSuccess` handlers to show toast feedback to the user. Other mutations in this file (e.g., `saveMutation`, `verifyMutation`) handle errors more gracefully or show success feedback.

💡 Suggested improvement

```diff
 const refreshModelsMutation = useMutation({
   mutationFn: async () => {
     // ... existing code ...
     return models;
   },
+  onSuccess: () => {
+    toast.success(t("models.byok.refreshSuccess"));
+  },
+  onError: (error) => {
+    toast.error(
+      error instanceof Error
+        ? error.message
+        : t("models.byok.refreshFailed"),
+    );
+  },
 });
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@apps/web/src/pages/models.tsx` around lines 1543-1573: the refreshModelsMutation currently throws on verification failure but has no onError/onSuccess handlers to surface toast feedback; update the useMutation call for refreshModelsMutation to add onSuccess and onError callbacks that (1) show success toasts when models are loaded (using the returned models and e.g., t messages), call setVerifiedModels as needed, and invalidate queries via queryClient.invalidateQueries (["providers"], ["models"]) after saveProvider, and (2) show error toasts when verifyApiKey returns invalid or when saveProvider fails (capture and log the error), ensuring you reference refreshModelsMutation, verifyApiKey, saveProvider, setVerifiedModels, and queryClient.invalidateQueries in the handlers.
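One way to keep the suggested handlers testable is to factor the toast decision into a pure function that the `onSuccess`/`onError` callbacks merely invoke. A sketch with illustrative names (only the i18n keys are taken from the suggestion above):

```typescript
// Decide which toast to show after the refresh-models mutation settles.
// Pure and side-effect free, so it can be unit tested without React Query.
function refreshToast(outcome: { ok: boolean; errorMessage?: string }): {
  kind: "success" | "error";
  messageKey: string;
} {
  if (outcome.ok) {
    return { kind: "success", messageKey: "models.byok.refreshSuccess" };
  }
  return {
    kind: "error",
    // Prefer the concrete error message; fall back to a generic key.
    messageKey: outcome.errorMessage ?? "models.byok.refreshFailed",
  };
}
```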
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/web/src/pages/models.tsx`:
- Around line 2267-2293: The refresh button is wrongly disabled when a saved
provider key exists but apiKey is empty; update the disabled and className
condition to allow clicking when either isOllama, apiKey, or
dbProvider?.hasApiKey is true (replace the current (!isOllama && !apiKey) /
(isOllama || apiKey) checks). Also adjust refreshModelsMutation invocation/logic
(and any place computing effectiveApiKey) so that when apiKey is empty it falls
back to dbProvider?.hasApiKey / the saved key for non-Ollama providers (ensure
symbols: apiKey, dbProvider?.hasApiKey, isOllama, refreshModelsMutation,
effectiveApiKey are updated accordingly).
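The corrected enable rule from this comment can be stated as a pure predicate; a sketch using the names from the comment (the helper itself is illustrative, not the PR's actual code):

```typescript
// Should the "refresh models" button be clickable?
// Ollama needs no key; otherwise either a freshly typed key or a
// previously saved key (dbProvider?.hasApiKey) should be enough.
function canRefreshModels(
  isOllama: boolean,
  apiKey: string,
  savedKeyExists: boolean, // mirrors dbProvider?.hasApiKey
): boolean {
  return isOllama || apiKey.length > 0 || savedKeyExists;
}
```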
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 6c115cd2-b32f-4be2-9a95-2eccc670733e
📒 Files selected for processing (6)
- apps/controller/src/lib/openclaw-config-compiler.ts
- apps/controller/src/routes/desktop-compat-routes.ts
- apps/controller/src/runtime/openclaw-process.ts
- apps/controller/src/services/model-provider-service.ts
- apps/web/src/pages/models.tsx
- tests/desktop/openclaw-config-compiler.test.ts
✅ Files skipped from review due to trivial changes (1)
- apps/controller/src/lib/openclaw-config-compiler.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- apps/controller/src/services/model-provider-service.ts
- apps/controller/src/routes/desktop-compat-routes.ts
💡 Codex Review
Reviewed commit: d51b9f3114
What
Add Ollama as a model provider and fix model selection so saved defaults stay compatible and remain synced in the UI.
Closes #565
Why
Users need to configure Ollama models alongside existing providers, and the model picker was not reliably reflecting saved defaults or legacy default IDs.
How
Affected areas
Checklist
- `pnpm typecheck` passes
- `pnpm lint` passes
- `pnpm test` passes
- `pnpm generate-types` run (if API routes/schemas changed)
- No `any` types introduced (use `unknown` with narrowing)

Notes for reviewers
Please focus on the saved-default migration path and the model picker sync behavior for legacy provider IDs.
Summary by CodeRabbit
New Features
Tests