## Summary

The memory-lancedb plugin's `openclaw.plugin.json` schema restricts `embedding.model` to only two values:

```json
"model": {
  "type": "string",
  "enum": ["text-embedding-3-small", "text-embedding-3-large"]
}
```

However, the TypeScript source (`extensions/memory-lancedb/config.ts`) already supports `gemini-embedding-001` in the `EMBEDDING_DIMENSIONS` map:
```ts
const EMBEDDING_DIMENSIONS: Record<string, number> = {
  "text-embedding-3-small": 1536,
  "text-embedding-3-large": 3072,
  "gemini-embedding-001": 3072, // <-- supported in code, rejected by schema
};
```

`config.ts` also accepts `baseUrl` and `apiKey` in the embedding config, which allows using Google's OpenAI-compatible endpoint (`https://generativelanguage.googleapis.com/v1beta/openai`). The JSON schema, however, rejects `baseUrl` as an additional property.
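For context, the `baseUrl` override is all that's needed to reach Gemini's OpenAI-compatible endpoint. Below is a hedged sketch of such a call using Node 18+'s built-in `fetch`; the `buildEmbeddingRequest` and `embed` helpers are illustrative, not the plugin's actual code, and the request/response shapes follow the OpenAI embeddings wire format:

```typescript
// Sketch: calling Gemini's OpenAI-compatible embeddings endpoint directly.
// Not verified against the live service; helper names are hypothetical.
const baseUrl = "https://generativelanguage.googleapis.com/v1beta/openai";

// Builds the request body/URL for a single embedding call.
function buildEmbeddingRequest(model: string, input: string) {
  return {
    url: `${baseUrl}/embeddings`,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, input }),
  };
}

// Performs the call; requires a real Gemini API key to succeed.
async function embed(text: string, apiKey: string): Promise<number[]> {
  const req = buildEmbeddingRequest("gemini-embedding-001", text);
  const res = await fetch(req.url, {
    method: "POST",
    headers: { ...req.headers, Authorization: `Bearer ${apiKey}` },
    body: req.body,
  });
  if (!res.ok) throw new Error(`embedding request failed: ${res.status}`);
  const data = await res.json();
  // Per EMBEDDING_DIMENSIONS above, this vector should have length 3072.
  return data.data[0].embedding;
}
```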
## Expected Behavior

The `configSchema` in `openclaw.plugin.json` should match the runtime capabilities:

```json
"model": {
  "type": "string",
  "enum": ["text-embedding-3-small", "text-embedding-3-large", "gemini-embedding-001"]
}
```

And `baseUrl` should be added to the `embedding` schema properties:
```json
"embedding": {
  "properties": {
    "apiKey": { "type": "string" },
    "model": { "type": "string", "enum": [...] },
    "baseUrl": { "type": "string" }
  }
}
```

## Workaround

Currently I have to patch `openclaw.plugin.json` by hand after every `npm install` / `openclaw` update, which is fragile.
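One way to make the re-patching less error-prone is a small script run after each install. This is only a sketch: the manifest path is an assumption (point it at wherever `openclaw.plugin.json` actually lives in your install), and `patchEmbeddingSchema` is a hypothetical helper based on the schema shape shown in this report.

```typescript
// patch-plugin-schema.ts — automates the manual patch described above.
import { existsSync, readFileSync, writeFileSync } from "node:fs";

type Json = Record<string, any>;

// Mutates the parsed manifest so the schema matches config.ts's capabilities:
// adds gemini-embedding-001 to the model enum and permits baseUrl.
function patchEmbeddingSchema(manifest: Json): Json {
  const embedding = manifest.configSchema.properties.embedding;
  const modelEnum: string[] = embedding.properties.model.enum;
  if (!modelEnum.includes("gemini-embedding-001")) {
    modelEnum.push("gemini-embedding-001");
  }
  embedding.properties.baseUrl ??= { type: "string" };
  return manifest;
}

// Assumed location; adjust for your environment.
const manifestPath = "extensions/memory-lancedb/openclaw.plugin.json";
if (existsSync(manifestPath)) {
  const patched = patchEmbeddingSchema(
    JSON.parse(readFileSync(manifestPath, "utf8")),
  );
  writeFileSync(manifestPath, JSON.stringify(patched, null, 2) + "\n");
}
```

Wiring this into a `postinstall` script would keep the patch applied across updates, though a proper fix in the published schema is still the right answer.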
## Working Config (after patch)

```json
{
  "plugins": {
    "entries": {
      "memory-lancedb": {
        "enabled": true,
        "config": {
          "embedding": {
            "apiKey": "<GEMINI_API_KEY>",
            "model": "gemini-embedding-001",
            "baseUrl": "https://generativelanguage.googleapis.com/v1beta/openai"
          },
          "autoCapture": true,
          "autoRecall": true
        }
      }
    },
    "slots": {
      "memory": "memory-lancedb"
    }
  }
}
```

## Environment
- OpenClaw version: 2026.2.14
- OS: Ubuntu 24.04 (Hetzner VPS)
- Embedding provider: Google Gemini via OpenAI-compatible API