feat(memory-lancedb): native Azure OpenAI support #47285

Restry wants to merge 1 commit into openclaw:main from
Conversation
This PR (re)implements native Azure OpenAI support for the memory-lancedb plugin, addressing the feedback from closed PR #25. Key Features:
Why Native? I am committed to adding tests and making api-version configurable if this direction is accepted.
Greptile Summary

This PR adds native Azure OpenAI support to the memory-lancedb plugin.
Confidence Score: 4/5
Greptile comment — `extensions/memory-lancedb/index.ts`, line 179. Last reviewed commit: 805ad58
```ts
...(isAzure
  ? {
      defaultHeaders: { "api-key": apiKey },
      defaultQuery: { "api-version": "2024-02-01" },
```
**Hardcoded `api-version` with no override path**

The Azure API version is pinned to `"2024-02-01"`, which is one of the earlier GA releases. Azure OpenAI regularly releases new API versions and eventually retires old ones. Users who need features from newer versions (or whose tenant deprecates `2024-02-01`) will have no recourse short of forking the plugin.

Looking at `config.ts`, the `assertAllowedKeys` guard at line 108 only allows `["apiKey", "model", "baseUrl", "dimensions"]`, so there is currently no way to pass an `apiVersion` setting through the plugin config.

Consider:

1. Adding an optional `apiVersion` field to the `embedding` section of `config.ts` (and updating `assertAllowedKeys` to accept it), and
2. Using that value here with a sensible default.

```suggestion
defaultQuery: { "api-version": "2024-08-01-preview" },
```

At minimum, bumping the default to a more recent stable version reduces the risk of hitting a sudden deprecation.
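One way the reviewer's two-step suggestion could be wired up — a hedged sketch only, since the thread does not show the actual shape of `config.ts` (the `EmbeddingConfig` type and `azureQuery` helper below are illustrative names, not code from the PR):

```typescript
// Illustrative sketch of plumbing an optional apiVersion through the config.
// The key list mirrors the assertAllowedKeys values quoted in the review,
// with "apiVersion" added; the real config.ts may be structured differently.
const ALLOWED_EMBEDDING_KEYS = [
  "apiKey",
  "model",
  "baseUrl",
  "dimensions",
  "apiVersion", // newly allowed optional key
] as const;

type EmbeddingConfig = {
  apiKey: string;
  model: string;
  baseUrl?: string;
  dimensions?: number;
  apiVersion?: string;
};

// Build the Azure query object, falling back to a pinned default
// when the user does not override the API version.
function azureQuery(cfg: EmbeddingConfig): { "api-version": string } {
  return { "api-version": cfg.apiVersion ?? "2024-02-01" };
}
```

With this in place, the hardcoded `defaultQuery` literal in the constructor would become `defaultQuery: azureQuery(cfg)`, so tenants on newer API versions are no longer forced to fork the plugin.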
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 805ad58f05
```ts
private dimensions?: number,
) {
  this.client = new OpenAI({ apiKey, baseURL: baseUrl });
  const isAzure = baseUrl?.includes(".openai.azure.com");
```
**Handle Azure AI Foundry hosts in endpoint detection**

The new Azure detection only checks for `.openai.azure.com`, so valid Azure AI Foundry endpoints such as `https://<resource>.services.ai.azure.com/...` are not treated as Azure and therefore do not get the required `api-key` header / `api-version` query injection. OpenClaw already classifies both host families as Azure in `src/commands/onboard-custom.ts`, so this mismatch will cause memory embedding calls against `*.services.ai.azure.com` to be sent with OpenAI-style auth and fail at runtime.
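A hedged sketch of what parsed-hostname detection covering both host families could look like (the function name `isAzureEndpoint` is illustrative; the PR itself only has the inline `includes` check):

```typescript
// Illustrative: classify an endpoint as Azure by parsing the hostname
// rather than substring-matching the whole URL, and cover both
// *.openai.azure.com and *.services.ai.azure.com (Azure AI Foundry).
function isAzureEndpoint(baseUrl?: string): boolean {
  if (!baseUrl) return false;
  let host: string;
  try {
    host = new URL(baseUrl).hostname;
  } catch {
    return false; // not a parseable URL, treat as non-Azure
  }
  return (
    host.endsWith(".openai.azure.com") ||
    host.endsWith(".services.ai.azure.com")
  );
}
```

Parsing the hostname also avoids false positives such as a path or query string that merely contains the substring `.openai.azure.com`.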
Codex review: needs real behavior proof before merge.
Review details

Best possible solution: Add Azure-aware memory-lancedb request shaping with parsed hostname detection, deployment-scoped embedding URLs, explicit API-version configuration, docs/changelog, focused tests, and redacted live Azure proof while preserving the generic OpenAI-compatible path.

Do we have a high-confidence way to reproduce the issue? Yes for the PR defects: source inspection gives a high-confidence path using memory-lancedb with

Is this the best way to solve the issue? No. The narrow constructor patch is not the best fix because native Azure support needs URL normalization, bearer-auth suppression, API-version configuration, docs, and runtime proof rather than hard-coded SDK defaults.

Full review comments:
Overall correctness: patch is incorrect

What I checked:
Likely related people:
Remaining risk / open question:
Codex review notes: model gpt-5.5, reasoning high; reviewed against 29689c62d079.
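One concrete reading of the review's "deployment-scoped embedding URLs" point — a hedged sketch under the assumption that Azure routes embeddings through `/openai/deployments/<deployment>/embeddings` rather than the OpenAI-style `/v1/embeddings` path (the helper name is hypothetical, not from the PR):

```typescript
// Illustrative: build a deployment-scoped Azure embeddings URL, where the
// deployment name plays the role the model name plays on api.openai.com.
function azureEmbeddingsUrl(
  resourceBase: string, // e.g. "https://my-res.openai.azure.com" (example value)
  deployment: string,
  apiVersion: string,
): string {
  const u = new URL(resourceBase);
  u.pathname = `/openai/deployments/${deployment}/embeddings`;
  u.search = new URLSearchParams({ "api-version": apiVersion }).toString();
  return u.toString();
}
```

This is the kind of URL normalization the review argues a native integration needs on top of the constructor patch.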
Force-pushed 805ad58 to e7e1ffa
Prepared this one for maintainer review/CI.

Changes made on top of the original patch:
- Rebased onto current
This PR adds native Azure OpenAI support to the memory-lancedb plugin.
It automatically detects Azure endpoints (via the `.openai.azure.com` suffix in `baseUrl`) and injects the required `api-key` header and `api-version` query parameter (defaulting to `2024-02-01`).
This allows users to use Azure OpenAI for embeddings without needing an intermediate proxy like LiteLLM or OneAPI, significantly reducing friction for enterprise users.
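For reference, a hypothetical plugin config under this design — the key names follow the `assertAllowedKeys` list quoted in the review (`apiKey`, `model`, `baseUrl`, `dimensions`); the surrounding structure and placeholder values are illustrative, not taken from the PR:

```json
{
  "memory-lancedb": {
    "embedding": {
      "apiKey": "<azure-api-key>",
      "model": "<deployment-name>",
      "baseUrl": "https://<resource>.openai.azure.com",
      "dimensions": 1536
    }
  }
}
```

Because the `.openai.azure.com` host triggers the Azure branch, no LiteLLM or OneAPI proxy sits between the plugin and Azure.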