feat(memory-lancedb): support Ollama and OpenAI-compatible embedding endpoints #17030
Closed
nightfullstar wants to merge 2 commits into openclaw:main from
Conversation
… Ollama

- Updated MemoryConfig to include optional baseUrl and dimensions for embedding.
- Modified vectorDimsForModel to accept explicit dimensions.
- Refactored resolveEmbeddingConfig to validate and resolve embedding settings.
- Adjusted Embeddings class to support OpenAI-compatible endpoints, including Ollama.
- Updated tests to validate new configuration options and ensure proper error handling for missing apiKey when using OpenAI.
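The changes listed in this commit message might be sketched roughly like this (illustrative names and simplified shapes, not the PR's actual code; the built-in dimensions for `nomic-embed-text` and `mxbai-embed-large` come from the PR description, while the `text-embedding-3-small` entry is an assumed example):

```typescript
// Sketch of the embedding configuration and client changes, under assumed shapes.
interface EmbeddingConfig {
  model: string;
  baseUrl?: string;    // e.g. "http://localhost:11434/v1" for Ollama
  apiKey?: string;     // optional when baseUrl points at a local endpoint
  dimensions?: number; // override for models not in the built-in list
}

// Built-in dimension table, extended with the Ollama models from this PR.
const BUILT_IN_DIMS: Record<string, number> = {
  "text-embedding-3-small": 1536, // assumed pre-existing entry
  "nomic-embed-text": 768,
  "mxbai-embed-large": 1024,
};

// Explicit dimensions win; otherwise fall back to the built-in table.
function vectorDimsForModel(model: string, dimensions?: number): number {
  if (dimensions !== undefined) return dimensions;
  const dims = BUILT_IN_DIMS[model];
  if (dims === undefined) {
    throw new Error(`Unknown embedding model "${model}"; set embedding.dimensions`);
  }
  return dims;
}

// Build a request for any OpenAI-compatible /v1/embeddings endpoint.
// The Authorization header is only sent when apiKey is set, so Ollama needs no key.
function buildEmbeddingRequest(config: EmbeddingConfig, input: string[]) {
  const baseUrl = config.baseUrl ?? "https://api.openai.com/v1";
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (config.apiKey) headers["Authorization"] = `Bearer ${config.apiKey}`;
  return {
    url: `${baseUrl.replace(/\/+$/, "")}/embeddings`,
    method: "POST",
    headers,
    body: JSON.stringify({ model: config.model, input }),
  };
}
```

With these helpers, an Ollama-style config (`baseUrl` set, no `apiKey`) resolves to an unauthenticated request against `http://localhost:11434/v1/embeddings`, while an OpenAI config keeps sending a bearer token as before.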
Contributor
Additional Comments (1)
Prompt To Fix With AI: This is a comment left during a code review.
Path: extensions/memory-lancedb/package.json
Line: 10:10
Comment:
`openai` dependency should be removed since it's no longer imported/used in `index.ts`
```suggestion
"@sinclair/typebox": "0.34.48"
```
How can I resolve this? If you propose a fix, please make it concise.
Force-pushed from bfc1ccb to f92900f (Compare)
Contributor
Additional Comments (1)
Prompt To Fix With AI: This is a comment left during a code review.
Path: extensions/memory-lancedb/package.json
Line: 10:10
Comment:
`openai` dependency no longer used (replaced with `fetch` in `index.ts:176-194`), can be removed
```suggestion
}
```
How can I resolve this? If you propose a fix, please make it concise.
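One concrete way to act on the reviewer's suggestion is to delete the `dependencies.openai` entry. The sketch below demonstrates this on a scratch copy of `package.json` (the `openai` version shown is illustrative; in the actual repo you would run the `node -e` step inside `extensions/memory-lancedb`):

```shell
# Create a scratch package.json resembling the extension's manifest.
tmp=$(mktemp -d)
cd "$tmp"
cat > package.json <<'EOF'
{
  "name": "memory-lancedb",
  "dependencies": {
    "@sinclair/typebox": "0.34.48",
    "openai": "^4.0.0"
  }
}
EOF

# Delete the unused dependencies.openai entry in place.
node -e '
const fs = require("fs");
const pkg = JSON.parse(fs.readFileSync("package.json", "utf8"));
delete pkg.dependencies.openai;
fs.writeFileSync("package.json", JSON.stringify(pkg, null, 2) + "\n");
'
cat package.json
```

Equivalently, running the project's package manager (e.g. `pnpm remove openai` in the extension directory) removes the entry and updates the lockfile at the same time.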
This pull request has been automatically marked as stale due to inactivity.
Member
Thanks for pushing this through. I am closing this as a duplicate of #17874, which is now merged and covers the same memory-lancedb OpenAI-compatible embeddings path. Your contribution is still part of the history for this area. If there is behavior here that #17874 missed, reply here and we can reopen quickly.
Summary
The memory-lancedb plugin previously required an OpenAI API key and only talked to OpenAI's embeddings API. This PR adds support for Ollama and any other service that exposes the OpenAI-compatible `/v1/embeddings` API (e.g. local or self-hosted endpoints), so you can run long-term memory without an OpenAI key.

Scope
- Config (`config.ts`): `embedding.apiKey` is now optional when `embedding.baseUrl` is set (e.g. Ollama). Added `embedding.baseUrl` (default `https://api.openai.com/v1`) and `embedding.dimensions` (for models not in the built-in list). Built-in dimensions extended with `nomic-embed-text` (768) and `mxbai-embed-large` (1024).
- Client (`index.ts`): Replaced the hardcoded `openai` client with a small `fetch`-based client that uses `baseUrl` and only sends `Authorization` when `apiKey` is set.
- Tests (`index.test.ts`): Updated the "missing apiKey" assertion to the new error message; added a test that an Ollama-style config (`baseUrl` + model, no `apiKey`) parses correctly.
- Docs: Added `OLLAMA-SUPPORT.md` describing the change and how to use the plugin with Ollama.

User-facing changes
- New options: `plugins.entries.memory-lancedb.config.embedding.baseUrl` (e.g. `http://localhost:11434/v1` for Ollama) and `embedding.dimensions` (required for unknown models).
- When `baseUrl` is set to a local/OpenAI-compatible endpoint (e.g. Ollama), `embedding.apiKey` can be omitted.
- For OpenAI, set `baseUrl` (or leave the default) and set `embedding.apiKey` as before.

Testing
- `pnpm test -- extensions/memory-lancedb/index.test.ts` — all tests pass (including the new Ollama config test).
- `pnpm check` (no new issues).

Example (Ollama)
```json
{
  "plugins": {
    "entries": {
      "memory-lancedb": {
        "config": {
          "embedding": {
            "baseUrl": "http://localhost:11434/v1",
            "model": "nomic-embed-text"
          },
          "dbPath": "~/.openclaw/memory/lancedb",
          "autoCapture": true,
          "autoRecall": true
        }
      }
    },
    "slots": { "memory": "memory-lancedb" }
  }
}
```

No `apiKey` required; run `ollama pull nomic-embed-text` and ensure Ollama is serving.

Greptile Summary
Adds support for Ollama and OpenAI-compatible embedding endpoints to the memory-lancedb plugin, allowing users to run long-term memory without an OpenAI API key. The implementation replaces the hardcoded OpenAI client with a generic `fetch`-based client that conditionally sends authorization headers when an API key is provided.

Changes made:
- `config.ts`: Made `embedding.apiKey` optional when `embedding.baseUrl` is set, added `embedding.baseUrl` and `embedding.dimensions` config options, extended built-in dimension mappings for Ollama models (`nomic-embed-text`, `mxbai-embed-large`)
- `index.ts`: Replaced the OpenAI SDK with a lightweight `fetch` implementation that works with any OpenAI-compatible `/v1/embeddings` endpoint
- `index.test.ts`: Updated test assertions to match new error messages and added validation for the Ollama configuration

Issues found:
- The `openai` package dependency is still present in `package.json` but is no longer imported or used in the code

Confidence Score: 4/5
- Remaining issue: remove the unused `openai` dependency from `package.json`.
- File needing attention: `extensions/memory-lancedb/package.json` (remove unused dependency).

Last reviewed commit: 8baa707