Add passthrough support for Anthropic Messages API #19423
TomeHirata merged 4 commits into mlflow:master
Conversation
Pull request overview
This PR adds native Anthropic Messages API passthrough support to the MLflow AI Gateway, enabling users to interact with Anthropic models using the official Anthropic client SDK. It's part of a stacked PR series that introduces LiteLLM provider support and passthrough endpoints for both OpenAI and Anthropic.
Key Changes:
- Added `/gateway/anthropic/v1/messages` endpoint for native Anthropic client compatibility
- Implemented passthrough methods in `AnthropicProvider` supporting both streaming and non-streaming responses
- Introduced LiteLLM provider as a fallback for unsupported providers in the gateway
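The LiteLLM fallback can be pictured as a registry lookup with a default: if no dedicated provider class is registered for an endpoint's provider, the request is routed through LiteLLM. A minimal sketch, with illustrative class and registry names rather than MLflow's actual internals:

```python
# Hedged sketch of the fallback behavior described above. The names
# (PROVIDER_REGISTRY, LiteLLMProvider, create_provider) are illustrative.

class OpenAIProvider:
    NAME = "openai"  # has a dedicated implementation

class LiteLLMProvider:
    NAME = "litellm"  # generic fallback backed by the LiteLLM library

PROVIDER_REGISTRY = {"openai": OpenAIProvider}

def create_provider(provider_name: str):
    # Use the dedicated provider when one is registered; otherwise route
    # the endpoint through LiteLLM so long-tail providers still work.
    provider_cls = PROVIDER_REGISTRY.get(provider_name, LiteLLMProvider)
    return provider_cls()
```

This keeps the registry small while still serving providers that have no bespoke integration.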
Reviewed changes
Copilot reviewed 12 out of 12 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| `mlflow/server/gateway_api.py` | Added `anthropic_passthrough_messages` endpoint handler and helper function `_extract_endpoint_name_from_model`; updated `_create_provider_from_endpoint_name` to use LiteLLM as a fallback for unsupported providers |
| `mlflow/gateway/providers/anthropic.py` | Implemented `passthrough_anthropic_messages` method for raw Anthropic API request/response handling with streaming support |
| `mlflow/gateway/providers/openai.py` | Added three passthrough methods: `passthrough_openai_chat`, `passthrough_openai_embeddings`, and `passthrough_openai_responses` for raw OpenAI API compatibility |
| `mlflow/gateway/providers/base.py` | Added abstract passthrough method definitions for the OpenAI and Anthropic APIs to the `BaseProvider` class |
| `mlflow/gateway/providers/litellm.py` | New provider implementation using the LiteLLM library to support long-tail LLM providers with chat, streaming, and embeddings functionality |
| `mlflow/gateway/config.py` | Added `LiteLLMConfig` class with `provider`, `api_key`, and `api_base` configuration options; added `LITELLM` to the `Provider` enum |
| `mlflow/gateway/provider_registry.py` | Registered `LiteLLMProvider` in the default provider registry |
| `tests/server/test_gateway_api.py` | Added integration tests for the OpenAI and Anthropic passthrough endpoints covering both streaming and non-streaming scenarios |
| `tests/gateway/providers/test_anthropic.py` | Added unit tests for Anthropic passthrough messages with streaming support |
| `tests/gateway/providers/test_openai.py` | Added unit tests for OpenAI passthrough endpoints, including an Azure OpenAI edge case |
| `tests/gateway/providers/test_litellm.py` | New test file covering LiteLLM provider chat, embeddings, and streaming functionality |
| `docs/api_reference/api_inventory.txt` | Added `LiteLLMConfig` API reference entries |
B-Step62
left a comment
LGTM, with a suggestion to DRY.
```python
provider_path = self.PASSTHROUGH_PROVIDER_PATHS.get(action)
if provider_path is None:
    route = PASSTHROUGH_ROUTES.get(action)
    supported_routes = ", ".join(
        f"/gateway{route} (provider_path: {path})"
        for act in self.PASSTHROUGH_PROVIDER_PATHS.keys()
        if (route := PASSTHROUGH_ROUTES.get(act))
        and (path := self.PASSTHROUGH_PROVIDER_PATHS.get(act))
    )
    raise AIGatewayException(
        status_code=400,
        detail=f"Unsupported passthrough endpoint '{route}' for {self.NAME} provider. "
        f"Supported endpoints: {supported_routes}",
    )
```
Shall we move this logic to the base class, e.g. `def get_passthrough_provider_path`? Looks generic enough.
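A rough sketch of what that suggestion might look like, with the lookup-and-error logic hoisted into the base class. `AIGatewayException` is replaced by a plain `ValueError` to keep the sketch self-contained, and the route table is trimmed to one illustrative entry:

```python
# Illustrative sketch of the reviewer's suggestion, not the merged code.
PASSTHROUGH_ROUTES = {"anthropic_messages": "/anthropic/v1/messages"}

class BaseProvider:
    NAME = "base"
    PASSTHROUGH_PROVIDER_PATHS: dict = {}

    def get_passthrough_provider_path(self, action: str) -> str:
        # Shared lookup: subclasses only declare PASSTHROUGH_PROVIDER_PATHS.
        provider_path = self.PASSTHROUGH_PROVIDER_PATHS.get(action)
        if provider_path is not None:
            return provider_path
        route = PASSTHROUGH_ROUTES.get(action)
        supported_routes = ", ".join(
            f"/gateway{r} (provider_path: {p})"
            for act, p in self.PASSTHROUGH_PROVIDER_PATHS.items()
            if (r := PASSTHROUGH_ROUTES.get(act))
        )
        raise ValueError(
            f"Unsupported passthrough endpoint '{route}' for {self.NAME} provider. "
            f"Supported endpoints: {supported_routes}"
        )

class AnthropicProvider(BaseProvider):
    NAME = "anthropic"
    PASSTHROUGH_PROVIDER_PATHS = {"anthropic_messages": "/v1/messages"}
```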
Signed-off-by: Tomu Hirata <tomu.hirata@gmail.com>
🥞 Stacked PR
Use this link to review incremental changes.
Related Issues/PRs
n/a
What changes are proposed in this pull request?
Add an Anthropic model passthrough endpoint to support the Anthropic client natively, where the request body is propagated unchanged to the provider endpoint.
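Since the body is forwarded verbatim, a client can build a standard Anthropic Messages payload and POST it to the gateway. The sketch below uses only the standard library; the host/port and model/endpoint names are illustrative assumptions, while the `/gateway/anthropic/v1/messages` path comes from this PR (and `_extract_endpoint_name_from_model` suggests the `model` field is used to resolve the configured gateway endpoint):

```python
import json
from urllib import request

# Assumed gateway address; the path is the one added in this PR.
GATEWAY_URL = "http://localhost:5000/gateway/anthropic/v1/messages"

# A standard Anthropic Messages body; "my-anthropic-endpoint" is a
# hypothetical endpoint name configured on the gateway.
payload = {
    "model": "my-anthropic-endpoint",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}],
}

req = request.Request(
    GATEWAY_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# request.urlopen(req) would send it; omitted here since it needs a running gateway.
```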
How is this PR tested?
Does this PR require documentation update?
Release Notes
Is this a user-facing change?
What component(s), interfaces, languages, and integrations does this PR affect?
Components
- area/tracking: Tracking Service, tracking client APIs, autologging
- area/models: MLmodel format, model serialization/deserialization, flavors
- area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
- area/scoring: MLflow Model server, model deployment tools, Spark UDFs
- area/evaluation: MLflow model evaluation features, evaluation metrics, and evaluation workflows
- area/gateway: MLflow AI Gateway client APIs, server, and third-party integrations
- area/prompts: MLflow prompt engineering features, prompt templates, and prompt management
- area/tracing: MLflow Tracing features, tracing APIs, and LLM tracing functionality
- area/projects: MLproject format, project running backends
- area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
- area/build: Build and test infrastructure for MLflow
- area/docs: MLflow documentation pages

How should the PR be classified in the release notes? Choose one:
- rn/none - No description will be included. The PR will be mentioned only by the PR number in the "Small Bugfixes and Documentation Updates" section
- rn/breaking-change - The PR will be mentioned in the "Breaking Changes" section
- rn/feature - A new user-facing feature worth mentioning in the release notes
- rn/bug-fix - A user-facing bug fix worth mentioning in the release notes
- rn/documentation - A user-facing documentation change worth mentioning in the release notes

Should this PR be included in the next patch release?
- "Yes" should be selected for bug fixes, documentation updates, and other small changes.
- "No" should be selected for new features and larger changes. If you're unsure about the release classification of this PR, leave this unchecked to let the maintainers decide.

What is a minor/patch release?
- Bug fixes, doc updates and new features usually go into minor releases.
- Bug fixes and doc updates usually go into patch releases.