Feature Request: Add Azure AI Foundry support (Azure OpenAI + Foundry Model Inference) with dual auth (API key & Entra ID)
Hi OpenClaw maintainers 👋
I’d like to request first-class support for Azure AI Foundry as a model provider, with secure multi-user authentication.
Why this matters
Azure AI Foundry is not OpenAI-only.
Foundry Models provides access to multiple model families under the same umbrella, including:
- Azure OpenAI
- Mistral AI
- Meta Llama
- Cohere
- AI21
- and others
This is documented in the Foundry Models FAQ:
https://learn.microsoft.com/en-us/azure/ai-foundry/foundry-models/faq?view=foundry-classic
Additionally, Foundry provides an official JavaScript SDK (`@azure/ai-projects`) that is project-centric and can enumerate deployed model endpoints:
- SDK overview: https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/develop/sdk-overview?view=foundry-classic&pivots=programming-language-javascript
- `@azure/ai-projects` package docs: https://learn.microsoft.com/en-us/javascript/api/overview/azure/ai-projects-readme?view=azure-node-latest
Requested features
1) Provider: azure-openai (OpenAI-compatible endpoints)
Add a provider that supports Azure OpenAI deployments hosted in Foundry.
Authentication must support both:
- API key via the `api-key` header
- Microsoft Entra ID via `Authorization: Bearer <token>`
Reference:
https://learn.microsoft.com/en-us/azure/ai-foundry/openai/reference?view=foundry-classic
This is essential for safe enterprise multi-user usage (RBAC, revocation, no shared secrets).
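To make the dual-auth requirement concrete, here is a minimal TypeScript sketch of how the provider could build request headers for either mode. The `AuthProfile` type and `buildHeaders` helper are illustrative names, not existing OpenClaw APIs; only the `api-key` and `Authorization: Bearer` header names come from the Azure reference above.

```typescript
// Illustrative sketch: per-request header construction for both auth modes.
type AuthProfile =
  | { kind: "api-key"; apiKey: string }
  | { kind: "entra-id"; bearerToken: string }; // token from e.g. @azure/identity

function buildHeaders(auth: AuthProfile): Record<string, string> {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (auth.kind === "api-key") {
    // Azure OpenAI key-based auth uses the api-key header.
    headers["api-key"] = auth.apiKey;
  } else {
    // Entra ID auth uses a standard bearer token.
    headers["Authorization"] = `Bearer ${auth.bearerToken}`;
  }
  return headers;
}
```

Keeping the auth choice behind one small function like this would let both providers share it.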
2) Provider: foundry-inference (Azure AI Model Inference API)
Add support for the Azure AI Model Inference API, so OpenClaw can use non-OpenAI models available in Foundry (Mistral, Llama, etc.) through a unified endpoint.
Reference:
https://learn.microsoft.com/en-us/rest/api/aifoundry/modelinference/
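A sketch of what the request builder for this provider could look like. The `/models/chat/completions` route follows the REST reference above, but the exact `api-version` value, the helper name, and the field shapes here are assumptions to verify against the docs:

```typescript
// Illustrative sketch: building a chat request for the Model Inference API.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildInferenceRequest(
  endpoint: string,   // e.g. https://<resource>.services.ai.azure.com
  model: string,      // e.g. a Mistral or Llama deployment name
  messages: ChatMessage[],
  apiVersion = "2024-05-01-preview", // assumed; check the REST reference
): { url: string; body: string } {
  return {
    url: `${endpoint}/models/chat/completions?api-version=${apiVersion}`,
    body: JSON.stringify({ model, messages }),
  };
}
```

Because the endpoint is unified, switching between model families would only change the `model` field, not the route.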
Multi-user safety requirement
OpenClaw should allow per-user credentials (auth profiles), so a shared OpenClaw gateway can be used safely without sharing a single API key.
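One possible shape for such auth profiles, sketched in TypeScript. The profile names, fields, and `profileFor` helper are all hypothetical; the point is that each user resolves to their own credential, so revoking one user does not affect the others:

```typescript
// Illustrative sketch: per-user auth profiles for a shared gateway.
type Profile =
  | { kind: "api-key"; apiKey: string }
  | { kind: "entra-id"; tenantId: string; clientId: string };

// In practice this would come from config or a secret store, not code.
const profiles: Record<string, Profile> = {
  alice: { kind: "entra-id", tenantId: "<tenant>", clientId: "<client>" },
  bob: { kind: "api-key", apiKey: "<bob-api-key>" },
};

function profileFor(user: string): Profile {
  const p = profiles[user];
  if (!p) throw new Error(`no auth profile for user ${user}`);
  return p; // each request is authenticated as this user only
}
```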
Summary
This feature would enable OpenClaw to:
- Run securely in enterprise Azure environments
- Use Foundry-hosted Azure OpenAI deployments
- Access the broader Foundry model catalog (Mistral, Llama, etc.)
- Support both API key and Entra ID authentication for multi-user safety
Thanks for considering this request!