
Commit 2eb4b55

Rename experimental_StreamData to StreamData. (#1309)

1 parent 149fe26

37 files changed (+113 −185 lines)


.changeset/real-spies-sort.md

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+---
+'ai': patch
+---
+
+Remove experimental\_ prefix from StreamData.

docs/pages/docs/api-reference.mdx

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ title: API Reference
 - [`StreamingTextResponse`](./api-reference/streaming-text-response)
 - [`AIStream`](./api-reference/ai-stream)
 - [`streamToResponse`](./api-reference/stream-to-response)
-- [`experimental_StreamData`](./api-reference/stream-data)
+- [`StreamData`](./api-reference/stream-data)

 ## Prompt Construction Helpers

docs/pages/docs/api-reference/_meta.json

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@
   "use-chat": "useChat",
   "use-completion": "useCompletion",
   "ai-stream": "AIStream",
-  "stream-data": "experimental_StreamData",
+  "stream-data": "StreamData",
   "streaming-text-response": "StreamingTextResponse",
   "stream-to-response": "streamToResponse",
   "tokens": "<Tokens />"

docs/pages/docs/api-reference/providers/inkeep-stream.mdx

Lines changed: 3 additions & 3 deletions
@@ -76,7 +76,7 @@ import {
   InkeepStream,
   InkeepOnFinalMetadata,
   StreamingTextResponse,
-  experimental_StreamData,
+  StreamData,
 } from 'ai';
 import { InkeepAI } from '@inkeep/ai-api';
 import type { RecordsCited$ } from '@inkeep/ai-api/models/components';
@@ -122,7 +122,7 @@ export async function POST(req: Request) {
   }

   // used to pass custom metadata to the client
-  const data = new experimental_StreamData();
+  const data = new StreamData();

   if (!response?.body) {
     throw new Error('Response body is null');
@@ -148,7 +148,7 @@ export async function POST(req: Request) {
 }
 ```

-This example uses the [experimental_StreamData](/docs/api-reference/stream-data) and the callback methods of `InkeepStream` to attach metadata to the response.
+This example uses the [StreamData](/docs/api-reference/stream-data) and the callback methods of `InkeepStream` to attach metadata to the response.

 ### Client

docs/pages/docs/api-reference/providers/mistral-stream.mdx

Lines changed: 2 additions & 6 deletions
@@ -53,11 +53,7 @@ export async function POST(req: Request) {

 ```tsx filename="app/api/completion/route.ts" showLineNumbers
 import MistralClient from '@mistralai/mistralai';
-import {
-  MistralStream,
-  StreamingTextResponse,
-  experimental_StreamData,
-} from 'ai';
+import { MistralStream, StreamingTextResponse, StreamData } from 'ai';

 const mistral = new MistralClient(process.env.MISTRAL_API_KEY || '');

@@ -75,7 +71,7 @@ export async function POST(req: Request) {
   });

   // optional: use stream data
-  const data = new experimental_StreamData();
+  const data = new StreamData();

   data.append({ test: 'value' });

docs/pages/docs/api-reference/stream-data.mdx

Lines changed: 5 additions & 15 deletions
@@ -1,32 +1,22 @@
 ---
-title: experimental_StreamData
+title: StreamData
 layout:
   toc: false
 ---

 import { Callout } from 'nextra-theme-docs';

-# `experimental_StreamData`
+# `StreamData`

-The `experimental_StreamData` class allows you to stream arbitrary data to the client alongside your LLM response.
+The `StreamData` class allows you to stream arbitrary data to the client alongside your LLM response.
 For information on the implementation, see the associated [pull request](https://github.com/vercel/ai/pull/425).

-<Callout>
-  The `experimental_` prefix indicates that the API is not yet stable and may
-  change in the future without a major version bump.
-
-</Callout>
-
 ## Usage

 ### On the Server

 ```jsx filename="app/api/chat/route.ts" {24-25,39-40,58-59,62-63,66-67}
-import {
-  OpenAIStream,
-  StreamingTextResponse,
-  experimental_StreamData,
-} from 'ai';
+import { OpenAIStream, StreamingTextResponse, StreamData } from 'ai';
 import OpenAI from 'openai';
 import type { ChatCompletionCreateParams } from 'openai/resources/chat';

@@ -46,7 +36,7 @@ export async function POST(req: Request) {
   });

   // Instantiate the StreamData. It works with all API providers.
-  const data = new experimental_StreamData();
+  const data = new StreamData();

   const stream = OpenAIStream(response, {
     experimental_onFunctionCall: async (
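The instantiate–append–close lifecycle this hunk touches can be sketched in isolation. Below is a minimal, hypothetical approximation of a StreamData-like buffer; the class name, the `flush` method, and the `2:` data-frame prefix are all assumptions for illustration, not the `ai` package's actual implementation:

```typescript
// Hypothetical sketch of a StreamData-like buffer; NOT the `ai` package's
// real implementation. The `2:` frame prefix is an assumed wire format for
// interleaving JSON data with the LLM text stream.
class SimpleStreamData {
  private values: unknown[] = [];
  private closed = false;

  // Buffer a JSON-serializable value to send alongside the text stream.
  append(value: unknown): void {
    if (this.closed) throw new Error('data has already been closed');
    this.values.push(value);
  }

  // Signal that no more values will be appended.
  close(): void {
    this.closed = true;
  }

  // Serialize all buffered values into one data frame and reset the buffer.
  flush(): string {
    if (this.values.length === 0) return '';
    const frame = `2:${JSON.stringify(this.values)}\n`;
    this.values = [];
    return frame;
  }
}

// Usage mirrors the docs: append metadata on the server, then flush it
// into the response stream once the values are ready.
const data = new SimpleStreamData();
data.append({ test: 'value' });
data.close();
const frame = data.flush(); // frame === '2:[{"test":"value"}]\n'
```

The key design point the docs imply is the same one sketched here: data is buffered server-side and serialized separately from the text tokens, so the client can recover both streams independently.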

docs/pages/docs/api-reference/streaming-react-response.mdx

Lines changed: 2 additions & 2 deletions
@@ -28,14 +28,14 @@ The `experimental_StreamingReactResponse` class is designed to facilitate stream

 This parameter should be a `ReadableStream`, which encapsulates the HTTP response's content. It represents the stream from which the response is read and processed.

-### `options?: {ui?: Function, data?: experimental_StreamData}`
+### `options?: {ui?: Function, data?: StreamData}`

 This optional parameter allows additional configurations for rendering React components and handling streamed data.

 The options object can include:

 - `ui?: (message: {content: string, data?: JSONValue[] | undefined}) => UINode | Promise<UINode>`: A function that receives a message object with `content` and optional `data` fields. This function should return a React component (as `UINode`) for each chunk in the stream. The `data` attribute in the message is available when the `data` option is configured to include stream data.
-- `data?: experimental_StreamData`: An instance of `experimental_StreamData` used to process and stream data along with the response.
+- `data?: StreamData`: An instance of `StreamData` used to process and stream data along with the response.

 ## Returns
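The per-chunk `ui` callback described in this hunk can be illustrated with a standalone sketch. Everything below is an invented approximation, not the SDK's internals: `UINode` is reduced to a plain string, and the `renderChunks` driver stands in for the response's chunk loop.

```typescript
// Hypothetical sketch of how a per-chunk `ui` callback could be driven.
// `UINode`, `UIRenderer`, and `renderChunks` are illustrative names only.
type JSONValue =
  | string
  | number
  | boolean
  | null
  | JSONValue[]
  | { [key: string]: JSONValue };
type UINode = string; // stands in for a React node in this sketch

type UIRenderer = (message: {
  content: string;
  data?: JSONValue[];
}) => UINode | Promise<UINode>;

// Invoke the renderer with the accumulated content after every chunk,
// passing along any stream data, and collect the rendered nodes.
async function renderChunks(
  chunks: string[],
  ui: UIRenderer,
  data?: JSONValue[],
): Promise<UINode[]> {
  const nodes: UINode[] = [];
  let content = '';
  for (const chunk of chunks) {
    content += chunk;
    nodes.push(await ui({ content, data }));
  }
  return nodes;
}

// Usage: the renderer sees the partial 'Hel', then the full 'Hello'.
renderChunks(['Hel', 'lo'], ({ content }) => content).then(nodes => {
  console.log(nodes); // [ 'Hel', 'Hello' ]
});
```

This mirrors the contract in the bullet above: the callback receives the text accumulated so far (not just the latest delta), plus the optional `data` array when a `StreamData` instance was supplied.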

docs/pages/docs/guides/providers/inkeep.mdx

Lines changed: 3 additions & 3 deletions
@@ -57,7 +57,7 @@ import {
   InkeepStream,
   InkeepOnFinalMetadata,
   StreamingTextResponse,
-  experimental_StreamData,
+  StreamData,
 } from 'ai';
 import { InkeepAI } from '@inkeep/ai-api';
 import type { RecordsCited$ } from '@inkeep/ai-api/models/components';
@@ -103,7 +103,7 @@ export async function POST(req: Request) {
   }

   // used to pass custom metadata to the client
-  const data = new experimental_StreamData();
+  const data = new StreamData();

   if (!response?.body) {
     throw new Error('Response body is null');
@@ -140,7 +140,7 @@ This example leverages a few utilities provided by the Vercel AI SDK:
   class with the default headers you probably want (hint: `'Content-Type':
   'text/plain; charset=utf-8'` is already set for you). This will provide the streamed content to the client.

-3. Lastly, we use the [experimental_StreamData](/docs/api-reference/stream-data) and callback methods of the `InkeepStream` to attach metadata to the response like `onFinalMetadata.chat_session_id` and `records_cited.citations` for use by the client.
+3. Lastly, we use the [StreamData](/docs/api-reference/stream-data) and callback methods of the `InkeepStream` to attach metadata to the response like `onFinalMetadata.chat_session_id` and `records_cited.citations` for use by the client.

 <Callout>
   It's common to save a chat to a database. To do so, you can leverage the

examples/next-inkeep/app/api/chat/route.ts

Lines changed: 2 additions & 2 deletions
@@ -2,7 +2,7 @@ import {
   InkeepStream,
   InkeepOnFinalMetadata,
   StreamingTextResponse,
-  experimental_StreamData,
+  StreamData,
 } from 'ai';
 import { InkeepAI } from '@inkeep/ai-api';
 import type { RecordsCited$ } from '@inkeep/ai-api/models/components';
@@ -48,7 +48,7 @@ export async function POST(req: Request) {
   }

   // used to pass custom metadata to the client
-  const data = new experimental_StreamData();
+  const data = new StreamData();

   if (!response?.body) {
     throw new Error('Response body is null');

examples/next-langchain/app/api/stream-data-basic/route.ts

Lines changed: 2 additions & 2 deletions
@@ -2,7 +2,7 @@ import {
   StreamingTextResponse,
   LangChainStream,
   Message,
-  experimental_StreamData,
+  StreamData,
 } from 'ai';
 import { ChatOpenAI } from 'langchain/chat_models/openai';
 import { AIMessage, HumanMessage } from 'langchain/schema';
@@ -12,7 +12,7 @@ export const runtime = 'edge';
 export async function POST(req: Request) {
   const { messages } = await req.json();

-  const data = new experimental_StreamData();
+  const data = new StreamData();

   // important: use LangChainStream from the AI SDK:
   const { stream, handlers } = LangChainStream({
