Summary
I'd like to use OpenRouter for memory search embeddings (via Google's text-embedding-004 or similar models through OpenRouter).
Current Behavior
When configuring `memorySearch` with OpenRouter:

```json5
agents: {
  defaults: {
    memorySearch: {
      provider: "openai",
      model: "google/text-embedding-004",
      remote: {
        baseUrl: "https://openrouter.ai/api/v1/",
        apiKey: "sk-or-v1-..."
      }
    }
  }
}
```
The CLI commands (`clawdbot memory index` and `clawdbot memory search`) work correctly, but the in-session `memory_search` tool doesn't respect the `remote.baseUrl` configuration and tries to hit OpenAI's API directly.
Expected Behavior
The `memory_search` tool should use the `remote.baseUrl` and `remote.apiKey` values from the config, allowing OpenRouter (or any OpenAI-compatible endpoint) to work for embeddings.
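As a minimal sketch of what that resolution could look like (the function and type names below are hypothetical illustrations, not Clawdbot's actual internals): the embeddings client should prefer `remote.baseUrl` when it is set and only fall back to the provider's default endpoint otherwise, so the CLI and the in-session tool resolve the same URL.

```typescript
// Hypothetical types/names for illustration; not Clawdbot's actual code.
interface RemoteConfig {
  baseUrl?: string;
  apiKey?: string;
}

interface MemorySearchConfig {
  provider: string;
  model: string;
  remote?: RemoteConfig;
}

const OPENAI_DEFAULT_BASE_URL = "https://api.openai.com/v1/";

// Resolve the embeddings endpoint the same way everywhere:
// remote.baseUrl wins when present, otherwise use the provider default.
function resolveEmbeddingsEndpoint(cfg: MemorySearchConfig): string {
  const base = cfg.remote?.baseUrl ?? OPENAI_DEFAULT_BASE_URL;
  return new URL("embeddings", base).toString();
}
```

With the OpenRouter config above, this resolves to `https://openrouter.ai/api/v1/embeddings` rather than OpenAI's endpoint.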
Workaround
Using Ollama locally works, since it exposes an OpenAI-compatible `/v1/embeddings` endpoint:
```json5
memorySearch: {
  provider: "openai",
  model: "nomic-embed-text",
  remote: {
    baseUrl: "http://localhost:11434/v1/",
    apiKey: "ollama"
  }
}
```
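The Ollama endpoint can be sanity-checked directly with curl (this assumes Ollama is running on its default port and the `nomic-embed-text` model has been pulled):

```shell
# Hit Ollama's OpenAI-compatible embeddings endpoint directly.
# Assumes a local Ollama instance with nomic-embed-text available.
curl http://localhost:11434/v1/embeddings \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ollama" \
  -d '{"model": "nomic-embed-text", "input": "hello world"}'
```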
Environment
- Clawdbot version: 2026.1.23-1
- macOS Darwin 24.1.0 (arm64)
Possible Cause
Looking at the code, `embeddings-openai.ts` properly reads `remote.baseUrl` and `remote.apiKey`, but there may be a caching issue where existing sessions don't pick up config changes: the `INDEX_CACHE` in `manager.ts` uses `JSON.stringify(settings)` as part of the cache key, but a new key only helps callers that re-resolve the cache entry, so changes may not propagate to already-running sessions.
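A minimal reproduction of that caching pattern (illustrative only, not the actual `manager.ts` implementation): a session that resolves its index entry once at startup keeps holding the old entry even after the config, and therefore the `JSON.stringify`-derived cache key, has changed.

```typescript
// Illustrative sketch of the suspected staleness, not Clawdbot's code.
const INDEX_CACHE = new Map<string, { baseUrl: string }>();

function getIndex(settings: { baseUrl: string }): { baseUrl: string } {
  // Cache key is derived from the serialized settings, as in manager.ts.
  const key = JSON.stringify(settings);
  let entry = INDEX_CACHE.get(key);
  if (!entry) {
    entry = { baseUrl: settings.baseUrl };
    INDEX_CACHE.set(key, entry);
  }
  return entry;
}

// A session that resolves its entry once at startup...
const session = getIndex({ baseUrl: "https://api.openai.com/v1/" });
// ...keeps the old entry even after the config changes; only callers
// that re-resolve with the new settings see the new endpoint.
const updated = getIndex({ baseUrl: "https://openrouter.ai/api/v1/" });
```

This would explain why the CLI commands (which build a fresh index per invocation) honor the config while a long-lived session does not.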