feat(typescript-anthropic): add streaming support #20384
B-Step62 merged 2 commits into mlflow:master
Conversation
Pull request overview
This pull request adds streaming support for the MLflow Anthropic TypeScript integration, addressing issue #20382. The implementation enables tracing for the messages.stream() method by wrapping the returned MessageStream object with a Proxy that intercepts both async iteration and finalMessage() calls.
Changes:
- Added tracing support for Anthropic's `messages.stream()` method
- Implemented Proxy-based wrapping for `MessageStream` with dual patterns: async iteration and `finalMessage()`
- Added mock SSE endpoint and comprehensive test coverage for streaming scenarios
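The dual-pattern wrapping can be sketched roughly as follows. This is a minimal illustration, not the actual MLflow implementation: `FakeMessageStream`, `startSpan`, and `endSpan` are hypothetical stand-ins for Anthropic's `MessageStream` and MLflow's span APIs.

```typescript
// Hypothetical simplified message/stream shapes for illustration only.
type Message = { content: string; usage: { input_tokens: number; output_tokens: number } };

class FakeMessageStream {
  private chunks = ['Hel', 'lo'];
  async *[Symbol.asyncIterator](): AsyncGenerator<string> {
    for (const c of this.chunks) yield c;
  }
  async finalMessage(): Promise<Message> {
    return { content: 'Hello', usage: { input_tokens: 3, output_tokens: 2 } };
  }
}

// Stand-in tracing API: records span lifecycle events instead of real spans.
const events: string[] = [];
const startSpan = (name: string) => { events.push(`start:${name}`); };
const endSpan = (name: string) => { events.push(`end:${name}`); };

function wrapMessageStream(stream: FakeMessageStream): FakeMessageStream {
  return new Proxy(stream, {
    get(target, prop, receiver) {
      if (prop === Symbol.asyncIterator) {
        // Pattern 1: trace `for await` loops by re-yielding chunks inside a span.
        return async function* () {
          startSpan('messages.stream');
          try {
            for await (const chunk of target) yield chunk;
          } finally {
            endSpan('messages.stream');
          }
        };
      }
      if (prop === 'finalMessage') {
        // Pattern 2: trace finalMessage() so the span covers the full streaming call.
        return async (): Promise<Message> => {
          startSpan('messages.stream');
          try {
            return await target.finalMessage();
          } finally {
            endSpan('messages.stream');
          }
        };
      }
      // Everything else falls through to the underlying stream untouched.
      return Reflect.get(target, prop, receiver);
    }
  });
}
```

Because the interception happens at the property-access layer, all other `MessageStream` members pass through `Reflect.get` unchanged, so the wrapped object stays a drop-in replacement.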
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| libs/typescript/integrations/anthropic/src/index.ts | Core streaming tracing implementation with Proxy-based MessageStream wrapping and async generator support |
| libs/typescript/integrations/anthropic/tests/index.test.ts | Added three comprehensive tests for streaming: async iteration, finalMessage(), and parent span integration |
| libs/typescript/integrations/anthropic/tests/mockAnthropicServer.ts | Added SSE streaming endpoint mock with proper event sequence simulation |
| docs/docs/genai/tracing/integrations/listing/anthropic.mdx | Updated feature matrix to indicate streaming support |
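For context on the mock server row above, here is a hedged sketch of how a mock SSE endpoint might serialize an Anthropic-style streaming event sequence. The event names follow Anthropic's streaming protocol (`message_start` through `message_stop`), but the payloads are simplified and `mockStreamBody` is a hypothetical helper, not the file's actual code.

```typescript
// Serialize one server-sent event in the standard "event:/data:" wire format.
function sseEvent(event: string, data: object): string {
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// Build a full mock response body simulating Anthropic's streaming event order.
function mockStreamBody(chunks: string[]): string {
  return [
    sseEvent('message_start', { type: 'message_start', message: { role: 'assistant', content: [], usage: { input_tokens: 3 } } }),
    sseEvent('content_block_start', { type: 'content_block_start', index: 0 }),
    ...chunks.map((text) =>
      sseEvent('content_block_delta', { type: 'content_block_delta', index: 0, delta: { type: 'text_delta', text } })
    ),
    sseEvent('content_block_stop', { type: 'content_block_stop', index: 0 }),
    sseEvent('message_delta', { type: 'message_delta', usage: { output_tokens: chunks.length } }),
    sseEvent('message_stop', { type: 'message_stop' })
  ].join('');
}
```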
Copilot reviewed 4 out of 4 changed files in this pull request and generated 3 comments.
Copilot reviewed 4 out of 4 changed files in this pull request and generated 2 comments.
Copilot reviewed 4 out of 4 changed files in this pull request and generated no new comments.
Documentation preview for 58e9a55 is available at: Changed Pages (1)
Thank you to whoever approved the action runs! I've fixed the lint/prettier errors and would love another run!
Copilot reviewed 4 out of 4 changed files in this pull request and generated no new comments.
B-Step62 left a comment:
LGTM! Left a minor comment.
          span.setAttribute(SpanAttributeKey.TOKEN_USAGE, usage);
        }
      } catch (error) {
        console.debug('Error extracting token usage', error);
Let's remove console.debug usage
This seems consistent with how the other integrations handle failures to extract token usage. If you do want me to break the pattern, should I just use `catch (error) {}`?
Also, apologies: I neglected to run lint 🤦. Fixed in line with the other integrations and pushed up another commit (I squashed all the previous feedback commits).
Oh, my bad, I wasn't aware we use that extensively in other flavors. Let's keep it as-is then :)
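The extraction pattern under discussion might look roughly like this sketch. `TOKEN_USAGE` and `SpanLike` are hypothetical stand-ins for MLflow's `SpanAttributeKey.TOKEN_USAGE` and span interface; the real attribute key and usage shape may differ.

```typescript
// Hypothetical simplified usage shape and span interface for illustration.
type Usage = { input_tokens: number; output_tokens: number };
interface SpanLike {
  setAttribute(key: string, value: unknown): void;
}

const TOKEN_USAGE = 'mlflow.chat.tokenUsage'; // hypothetical attribute key

function setTokenUsage(span: SpanLike, message: { usage?: Usage }): void {
  try {
    const usage = message.usage;
    if (usage) {
      span.setAttribute(TOKEN_USAGE, {
        input_tokens: usage.input_tokens,
        output_tokens: usage.output_tokens,
        total_tokens: usage.input_tokens + usage.output_tokens
      });
    }
  } catch (error) {
    // Never let a tracing failure break the user's call path; log at debug
    // level, consistent with the other integrations.
    console.debug('Error extracting token usage', error);
  }
}
```

The design choice is that tracing must be best-effort: a malformed usage payload should degrade to a debug log, never a thrown error in the user's streaming loop.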
Force-pushed from 04f76f1 to ea8796d (Compare)
Add tracing support for the `messages.stream()` method in the MLflow Anthropic TypeScript integration.

The `stream` method returns a `MessageStream` object that emits events as the response streams in. This implementation traces streaming by:

- Wrapping the returned `MessageStream` object in a Proxy
- Intercepting `finalMessage()` calls to create a span around the full streaming operation
- Intercepting the async iterator (`Symbol.asyncIterator`) to trace `for await` loops
- Extracting token usage from the final message after streaming completes
- Setting appropriate span attributes (`MESSAGE_FORMAT: 'anthropic'`, token usage, etc.)

- `libs/typescript/integrations/anthropic/src/index.ts` - Main implementation
- `libs/typescript/integrations/anthropic/tests/index.test.ts` - Tests
- `libs/typescript/integrations/anthropic/tests/mockAnthropicServer.ts` - Mock SSE endpoint

Co-Authored-By: Claude <noreply@anthropic.com>
Signed-off-by: Joel Dodge <joeldodge@gmail.com>
Related Issues/PRs
#20382

What changes are proposed in this pull request?
Add tracing support for the `messages.stream()` method in the MLflow Anthropic TypeScript integration. The `stream` method returns a `MessageStream` object that emits events as the response streams in. This implementation traces streaming by:

- Wrapping the returned `MessageStream` object in a Proxy
- Intercepting `finalMessage()` calls to create a span around the full streaming operation
- Intercepting the async iterator (`Symbol.asyncIterator`) to trace `for await` loops
- Extracting token usage from the final message after streaming completes
- Setting appropriate span attributes (`MESSAGE_FORMAT: 'anthropic'`, token usage, etc.)

- `libs/typescript/integrations/anthropic/src/index.ts` - Main implementation
- `libs/typescript/integrations/anthropic/tests/index.test.ts` - Tests
- `libs/typescript/integrations/anthropic/tests/mockAnthropicServer.ts` - Mock SSE endpoint

How is this PR tested?
Does this PR require documentation update?
Release Notes
Is this a user-facing change?
Add support for `@anthropic-ai/sdk`'s streaming responses

What component(s), interfaces, languages, and integrations does this PR affect?
Components
- area/tracking: Tracking Service, tracking client APIs, autologging
- area/models: MLmodel format, model serialization/deserialization, flavors
- area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
- area/scoring: MLflow Model server, model deployment tools, Spark UDFs
- area/evaluation: MLflow model evaluation features, evaluation metrics, and evaluation workflows
- area/gateway: MLflow AI Gateway client APIs, server, and third-party integrations
- area/prompts: MLflow prompt engineering features, prompt templates, and prompt management
- area/tracing: MLflow Tracing features, tracing APIs, and LLM tracing functionality
- area/projects: MLproject format, project running backends
- area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
- area/build: Build and test infrastructure for MLflow
- area/docs: MLflow documentation pages

How should the PR be classified in the release notes? Choose one:
- rn/none - No description will be included. The PR will be mentioned only by the PR number in the "Small Bugfixes and Documentation Updates" section
- rn/breaking-change - The PR will be mentioned in the "Breaking Changes" section
- rn/feature - A new user-facing feature worth mentioning in the release notes
- rn/bug-fix - A user-facing bug fix worth mentioning in the release notes
- rn/documentation - A user-facing documentation change worth mentioning in the release notes

Should this PR be included in the next patch release?
- Yes should be selected for bug fixes, documentation updates, and other small changes.
- No should be selected for new features and larger changes. If you're unsure about the release classification of this PR, leave this unchecked to let the maintainers decide.

What is a minor/patch release?
Bug fixes, doc updates and new features usually go into minor releases.
Bug fixes and doc updates usually go into patch releases.