
Commit 03a04f6

Backport: feat(google-vertex): add support for streaming tool arguments input (#14267)
This is an automated backport of #13929 to the release-v6.0 branch. FYI @aayush-kapoor

This backport has conflicts that need to be resolved manually.

### `git cherry-pick` output

```
Auto-merging packages/google/src/google-generative-ai-language-model.test.ts
Auto-merging packages/google/src/google-generative-ai-language-model.ts
CONFLICT (content): Merge conflict in packages/google/src/google-generative-ai-language-model.ts
Auto-merging packages/google/src/google-prepare-tools.ts
error: could not apply 5036db8... feat(google-vertex): add support for streaming tool arguments input (#13929)
hint: After resolving the conflicts, mark them with
hint: "git add/rm <pathspec>", then run
hint: "git cherry-pick --continue".
hint: You can instead skip this commit with "git cherry-pick --skip".
hint: To abort and get back to the state before "git cherry-pick",
hint: run "git cherry-pick --abort".
hint: Disable this message with "git config set advice.mergeConflict false"
```

Co-authored-by: Aayush Kapoor <83492835+aayush-kapoor@users.noreply.github.com>
Co-authored-by: Aayush Kapoor <aayushkapoor34@gmail.com>
1 parent 0cbc7cc commit 03a04f6

File tree

15 files changed: +2590 -29 lines


.changeset/itchy-frogs-sparkle.md

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@

```md
---
"@ai-sdk/google": patch
---

feat(google-vertex): add support for streaming tool arguments input
```

content/providers/01-ai-sdk-providers/16-google-vertex.mdx

Lines changed: 63 additions & 0 deletions

````diff
@@ -345,6 +345,15 @@ The following optional provider options are available for Google Vertex models:
 
   Consult [Google's Documentation](https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/add-labels-to-api-calls) for usage details.
 
+- **streamFunctionCallArguments** _boolean_
+
+  Optional. When set to `true`, function call arguments are streamed
+  incrementally in streaming responses. This enables `tool-input-delta` events
+  to arrive as the model generates function call arguments, reducing perceived
+  latency for tool calls. Defaults to `true` for Vertex AI providers. Only
+  supported on the Vertex AI API (not the Gemini API).
+
+  Consult [Google's Documentation](https://docs.cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling#streaming-fc) for details.
+
 You can use Google Vertex language models to generate text with the `generateText` function:
 
 ```ts highlight="1,4"
@@ -454,6 +463,60 @@ const result = await generateText({
 
 The optional `retrievalConfig.latLng` provider option provides location context for queries about nearby places. This configuration applies to any grounding tools that support location context.
 
+#### Streaming Function Call Arguments
+
+For Gemini 3 Pro and later models on Vertex AI, you can stream function call
+arguments as they are generated by setting `streamFunctionCallArguments` to
+`true`. This reduces perceived latency when functions need to be called, as
+`tool-input-delta` events arrive incrementally instead of waiting for the
+complete arguments. The option defaults to `true`; set it to `false` to opt
+out.
+
+```ts
+import { vertex } from '@ai-sdk/google-vertex';
+import { type GoogleLanguageModelOptions } from '@ai-sdk/google';
+import { streamText } from 'ai';
+import { z } from 'zod';
+
+const result = streamText({
+  model: vertex('gemini-3.1-pro-preview'),
+  prompt: 'What is the weather in Boston and San Francisco?',
+  tools: {
+    getWeather: {
+      description: 'Get the current weather in a given location',
+      inputSchema: z.object({
+        location: z.string().describe('City name'),
+      }),
+    },
+  },
+  providerOptions: {
+    vertex: {
+      streamFunctionCallArguments: false,
+    } satisfies GoogleLanguageModelOptions,
+  },
+});
+
+for await (const part of result.fullStream) {
+  switch (part.type) {
+    case 'tool-input-start':
+      console.log(`Tool call started: ${part.toolName}`);
+      break;
+    case 'tool-input-delta':
+      process.stdout.write(part.delta);
+      break;
+    case 'tool-call':
+      console.log(`Tool call complete: ${part.toolName}`, part.input);
+      break;
+  }
+}
+```
+
+<Note>
+  This feature is only available on the Vertex AI API. It is not supported on
+  the Gemini API. When used with the Google Generative AI provider, a warning
+  will be emitted and the option will be ignored.
+</Note>
+
 #### Reasoning (Thinking Tokens)
 
 Google Vertex AI, through its support for Gemini models, can also emit "thinking" tokens, representing the model's reasoning process. The AI SDK exposes these as reasoning information.
````
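Outside the SDK, the way `tool-input-delta` chunks build up into a complete argument string can be sketched with plain data. The part shapes below are simplified stand-ins modeled on the events in the documentation example, not the SDK's actual types:

```typescript
// Simplified stand-ins for the stream parts (assumption: the real AI SDK
// parts carry additional fields beyond these).
type ToolInputPart =
  | { type: 'tool-input-start'; id: string; toolName: string }
  | { type: 'tool-input-delta'; id: string; delta: string }
  | { type: 'tool-input-end'; id: string };

// Concatenate streamed argument deltas per tool call id.
function accumulateToolInputs(parts: ToolInputPart[]): Map<string, string> {
  const buffers = new Map<string, string>();
  for (const part of parts) {
    if (part.type === 'tool-input-start') buffers.set(part.id, '');
    else if (part.type === 'tool-input-delta')
      buffers.set(part.id, (buffers.get(part.id) ?? '') + part.delta);
  }
  return buffers;
}

const parts: ToolInputPart[] = [
  { type: 'tool-input-start', id: 'call_1', toolName: 'getWeather' },
  { type: 'tool-input-delta', id: 'call_1', delta: '{"location":' },
  { type: 'tool-input-delta', id: 'call_1', delta: '"Boston"}' },
  { type: 'tool-input-end', id: 'call_1' },
];

console.log(accumulateToolInputs(parts).get('call_1'));
// → {"location":"Boston"}
```

Once the stream ends, the accumulated string is complete JSON and can be parsed into the tool's input.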
Lines changed: 63 additions & 0 deletions

@@ -0,0 +1,63 @@

```ts
import { vertex } from '@ai-sdk/google-vertex';
import { type GoogleLanguageModelOptions } from '@ai-sdk/google';
import { convertToModelMessages, streamText, UIDataTypes, UIMessage } from 'ai';
import { z } from 'zod';

export const maxDuration = 60;

export type VertexStreamingToolCallsMessage = UIMessage<
  never,
  UIDataTypes,
  {
    showWeatherInformation: {
      input: {
        city: string;
        weather: string;
        temperature: number;
        description: string;
      };
      output: string;
    };
  }
>;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: vertex('gemini-3.1-pro-preview'),
    messages: await convertToModelMessages(messages),
    system:
      'You are a helpful weather assistant. ' +
      'Use getWeatherInformation to fetch weather data, then use showWeatherInformation to display it to the user. ' +
      'Always show the weather using the showWeatherInformation tool.',
    tools: {
      getWeatherInformation: {
        description: 'Get the current weather for a city',
        inputSchema: z.object({ city: z.string() }),
        execute: async ({ city }: { city: string }) => {
          const conditions = ['sunny', 'cloudy', 'rainy', 'snowy', 'windy'];
          return {
            city,
            weather: conditions[Math.floor(Math.random() * conditions.length)],
            temperature: Math.floor(Math.random() * 50 - 10),
          };
        },
      },
      showWeatherInformation: {
        description:
          'Show weather information to the user. Always use this tool to present weather data.',
        inputSchema: z.object({
          city: z.string(),
          weather: z.string(),
          temperature: z.number(),
          description: z
            .string()
            .describe('A brief description of the weather conditions.'),
        }),
      },
    },
  });

  return result.toUIMessageStreamResponse();
}
```
Lines changed: 116 additions & 0 deletions

@@ -0,0 +1,116 @@

```tsx
'use client';

import { useChat } from '@ai-sdk/react';
import ChatInput from '@/components/chat-input';
import {
  DefaultChatTransport,
  lastAssistantMessageIsCompleteWithToolCalls,
} from 'ai';
import { VertexStreamingToolCallsMessage } from '@/app/api/chat/vertex-streaming-tool-calls/route';

export default function Chat() {
  const { messages, status, sendMessage, addToolOutput } =
    useChat<VertexStreamingToolCallsMessage>({
      transport: new DefaultChatTransport({
        api: '/api/chat/vertex-streaming-tool-calls',
      }),

      sendAutomaticallyWhen: lastAssistantMessageIsCompleteWithToolCalls,

      async onToolCall({ toolCall }) {
        if (toolCall.toolName === 'showWeatherInformation') {
          addToolOutput({
            tool: 'showWeatherInformation',
            toolCallId: toolCall.toolCallId,
            output: 'Weather information displayed to user.',
          });
        }
      },
    });

  let lastRole: string | undefined = undefined;

  return (
    <div className="flex flex-col py-24 mx-auto w-full max-w-md stretch">
      <h1 className="text-lg font-bold mb-4">
        Vertex AI — Streaming Tool Call Arguments
      </h1>

      {messages?.map(m => {
        const isNewRole = m.role !== lastRole;
        lastRole = m.role;

        return (
          <div key={m.id} className="whitespace-pre-wrap mb-2">
            {isNewRole && (
              <strong className="block mb-1">{`${m.role}: `}</strong>
            )}
            {m.parts.map((part, i) => {
              if (part.type === 'text') {
                return <span key={i}>{part.text}</span>;
              }

              if (part.type === 'tool-showWeatherInformation') {
                if (part.state === 'input-streaming') {
                  return (
                    <div
                      key={i}
                      className="p-3 my-2 rounded border border-blue-300 bg-blue-50"
                    >
                      <div className="text-xs font-mono text-blue-600 mb-1">
                        streaming tool args…
                      </div>
                      <pre className="text-sm">
                        {JSON.stringify(part.input, null, 2)}
                      </pre>
                    </div>
                  );
                }

                if (part.state === 'input-available') {
                  return (
                    <div
                      key={i}
                      className="p-3 my-2 rounded border border-yellow-300 bg-yellow-50"
                    >
                      <div className="text-xs text-yellow-700 mb-1">
                        tool call complete — awaiting result
                      </div>
                      <pre className="text-sm">
                        {JSON.stringify(part.input, null, 2)}
                      </pre>
                    </div>
                  );
                }

                if (part.state === 'output-available') {
                  return (
                    <div
                      key={i}
                      className="p-4 my-2 rounded border border-green-300 bg-green-50"
                    >
                      <h4 className="font-semibold mb-1">{part.input.city}</h4>
                      <div className="flex gap-3 text-sm">
                        <span>🌡 {part.input.temperature}°C</span>
                        <span>{part.input.weather}</span>
                      </div>
                      {part.input.description && (
                        <p className="mt-1 text-sm text-gray-600">
                          {part.input.description}
                        </p>
                      )}
                    </div>
                  );
                }
              }

              return null;
            })}
          </div>
        );
      })}

      <ChatInput status={status} onSubmit={text => sendMessage({ text })} />
    </div>
  );
}
```
Lines changed: 53 additions & 0 deletions

@@ -0,0 +1,53 @@

```ts
import { vertex } from '@ai-sdk/google-vertex';
import { streamText } from 'ai';
import { z } from 'zod';
import { run } from '../../lib/run';
import { saveRawChunks } from '../../lib/save-raw-chunks';

run(async () => {
  const result = streamText({
    model: vertex('gemini-3.1-pro-preview'),
    prompt: 'What is the weather in Boston and San Francisco?',
    tools: {
      getWeather: {
        description: 'Get the current weather in a given location',
        inputSchema: z.object({
          location: z.string().describe('City name'),
        }),
      },
    },
    includeRawChunks: true,
  });

  for await (const part of result.fullStream) {
    switch (part.type) {
      case 'tool-input-start':
        console.log(`\n[tool-input-start] ${part.toolName} (${part.id})`);
        break;
      case 'tool-input-delta':
        process.stdout.write(part.delta);
        break;
      case 'tool-input-end':
        console.log(`\n[tool-input-end] (${part.id})`);
        break;
      case 'tool-call':
        console.log(`\n[tool-call] ${part.toolName}:`, part.input);
        break;
      case 'text-delta':
        process.stdout.write(part.text);
        break;
      case 'finish':
        console.log('\nFinish reason:', part.finishReason);
        console.log('Usage:', part.totalUsage);
        break;
      case 'error':
        console.error('Error:', part.error);
        break;
    }
  }

  await saveRawChunks({
    result,
    filename: 'google-vertex-stream-function-call-args-default.1',
  });
});
```
Lines changed: 62 additions & 0 deletions

@@ -0,0 +1,62 @@

```ts
import { vertex } from '@ai-sdk/google-vertex';
import { streamText } from 'ai';
import { z } from 'zod';
import { run } from '../../lib/run';
import { saveRawChunks } from '../../lib/save-raw-chunks';

run(async () => {
  const result = streamText({
    model: vertex('gemini-3.1-pro-preview'),
    prompt: 'Cook me a lasagna.',
    tools: {
      cookRecipe: {
        description: 'Cook a recipe',
        inputSchema: z.object({
          recipe: z.object({
            name: z.string(),
            ingredients: z.array(
              z.object({
                name: z.string(),
                amount: z.string(),
              }),
            ),
            steps: z.array(z.string()),
          }),
        }),
      },
    },
    includeRawChunks: true,
  });

  for await (const part of result.fullStream) {
    switch (part.type) {
      case 'tool-input-start':
        console.log(`\n[tool-input-start] ${part.toolName} (${part.id})`);
        break;
      case 'tool-input-delta':
        process.stdout.write(part.delta);
        break;
      case 'tool-input-end':
        console.log(`\n[tool-input-end] (${part.id})`);
        break;
      case 'tool-call':
        console.log(`\n[tool-call] ${part.toolName}:`, part.input);
        break;
      case 'text-delta':
        process.stdout.write(part.text);
        break;
      case 'finish':
        console.log('\nFinish reason:', part.finishReason);
        console.log('Usage:', part.totalUsage);
        break;
      case 'error':
        console.error('Error:', part.error);
        break;
    }
  }

  await saveRawChunks({
    result,
    filename: 'google-vertex-stream-function-call-args-default.1',
  });
});
```
Lines changed: 8 additions & 0 deletions

@@ -0,0 +1,8 @@

```jsonl
{"candidates":[{"content":{"role":"model","parts":[{"functionCall":{"name":"getWeather","willContinue":true},"thoughtSignature":"CiMBjz1rX25KieIB4d4AwFn8/WbsHTRNHBXso88PtemwfsSfRAp6AY89a1/e2FrmrE2HtlOxvV7vy8Kn5Og9n6CepcBKYW9aBT5QdXtTD4PGv9pDDWLEDAfsd86Et6k7HDEV976M3kC0ahLS8FCrwRntUKr+GF5uiUfPvgpzqNl5m8d8EwYykfwo1VwuTStR7q3mr/TzDcNzepafpVLdZloKYQGPPWtfkohmdau0GqGi3DuPbFTUBe5Ac/UbCvUk+P7KFbCGb93vFmnjmmTV7228DcAddqtWfe4iVePtSumw3JptYE6PIeVao2r8q7FXZKazzEiA9c04nmq6Vv9lOtt5NsgKrwEBjz1rX1ajIx3yeG/QI78uO+MsKoqgwrCI7PAEaDRqbyDCW/NX/4Vfuz1wp96oT3h/ttJ5ejzQyGamOrYreUUFvJBR/0TvFBdWOXwwCOWtvJ0Jh7SinV9HNqBDM1WWH8e+tGWY0r7xU/o/M1Fkwd73bvyp3BmSm5orrQL4jUZ1TN6ECqw7T7wHEkIfuBXGMZhQwAPzaapGVCdn0LjFJwd1aVReWmt6pE4ut993/K+0CqsBAY89a1+ts0Nggt7GZxVw/dQhH2Gc6KxRbAZKFMTuYRcNufJ5CyTvq85rMWEhTPWQxlQe1V7NIOlsQzyDhlejiCVbq0chDVrwRY23BmLrhSLi+SfKukEc0M+V9LdFiqxxLgUBrVdCvxA17g11j6HGdmXiNze2WDKm7ZXgMYvEiVmglpRZOb4JeJzxNPFsxhv2k6VkKB+GRtO48XwNz/i2taQ8Nq+LO5tED6HhCp4BAY89a18wBKoSkBT2IRo530FeNq/lpVozY0+k+p+H//TAtKoWZBamKCwra81idaMm1TTaWZGlX85Pq0kBvx1sQLFEhILeQrbH0O3Iuk9JlKLKctAwnBj8BZUGJEGtbigTKaSgWcMBhW+jfKWi1xwSxyJLRkIcXmE8U3FE/XHCLeiBy0NRa0yKn3xh62JU9087Q2nOMClxy5Zy5VsZ0qQ="}]}}],"usageMetadata":{"trafficType":"ON_DEMAND"},"modelVersion":"gemini-3.1-pro-preview","createTime":"2026-04-02T17:03:50.399550Z","responseId":"dqHOab6xGLzWodAPkPuViA4"}
{"candidates":[{"content":{"role":"model","parts":[{"functionCall":{"partialArgs":[{"jsonPath":"$.location","stringValue":"Boston","willContinue":true}],"willContinue":true}}]}}],"usageMetadata":{"trafficType":"ON_DEMAND"},"modelVersion":"gemini-3.1-pro-preview","createTime":"2026-04-02T17:03:50.399550Z","responseId":"dqHOab6xGLzWodAPkPuViA4"}
{"candidates":[{"content":{"role":"model","parts":[{"functionCall":{"partialArgs":[{"jsonPath":"$.location","stringValue":""}],"willContinue":true}}]}}],"usageMetadata":{"trafficType":"ON_DEMAND"},"modelVersion":"gemini-3.1-pro-preview","createTime":"2026-04-02T17:03:50.399550Z","responseId":"dqHOab6xGLzWodAPkPuViA4"}
{"candidates":[{"content":{"role":"model","parts":[{"functionCall":{}}]}}],"usageMetadata":{"trafficType":"ON_DEMAND"},"modelVersion":"gemini-3.1-pro-preview","createTime":"2026-04-02T17:03:50.399550Z","responseId":"dqHOab6xGLzWodAPkPuViA4"}
{"candidates":[{"content":{"role":"model","parts":[{"functionCall":{"name":"getWeather","willContinue":true}}]}}],"usageMetadata":{"trafficType":"ON_DEMAND"},"modelVersion":"gemini-3.1-pro-preview","createTime":"2026-04-02T17:03:50.399550Z","responseId":"dqHOab6xGLzWodAPkPuViA4"}
{"candidates":[{"content":{"role":"model","parts":[{"functionCall":{"partialArgs":[{"jsonPath":"$.location","stringValue":"San Francisco","willContinue":true}],"willContinue":true}}]}}],"usageMetadata":{"trafficType":"ON_DEMAND"},"modelVersion":"gemini-3.1-pro-preview","createTime":"2026-04-02T17:03:50.399550Z","responseId":"dqHOab6xGLzWodAPkPuViA4"}
{"candidates":[{"content":{"role":"model","parts":[{"functionCall":{"partialArgs":[{"jsonPath":"$.location","stringValue":""}],"willContinue":true}}]}}],"usageMetadata":{"trafficType":"ON_DEMAND"},"modelVersion":"gemini-3.1-pro-preview","createTime":"2026-04-02T17:03:50.399550Z","responseId":"dqHOab6xGLzWodAPkPuViA4"}
{"candidates":[{"content":{"role":"model","parts":[{"functionCall":{}}]},"finishReason":"STOP"}],"usageMetadata":{"promptTokenCount":26,"candidatesTokenCount":23,"totalTokenCount":181,"trafficType":"ON_DEMAND","promptTokensDetails":[{"modality":"TEXT","tokenCount":26}],"candidatesTokensDetails":[{"modality":"TEXT","tokenCount":23}],"thoughtsTokenCount":132},"modelVersion":"gemini-3.1-pro-preview","createTime":"2026-04-02T17:03:50.399550Z","responseId":"dqHOab6xGLzWodAPkPuViA4"}
```
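In these raw chunks, `partialArgs` entries address argument fields by JSON path (`$.location`) and carry string fragments; a fragment with an empty `stringValue` and no further `willContinue` on the part marks the end of a call's arguments. Assembling one call's fragments can be sketched as follows (a hypothetical helper, not part of the SDK; assumes only top-level string paths as seen above):

```typescript
// Shape of one entry in a `partialArgs` array, as seen in the raw chunks.
type PartialArg = {
  jsonPath: string;
  stringValue?: string;
  willContinue?: boolean;
};

// Assemble one function call's fragment batches into an arguments object.
// Assumption: only top-level string paths like "$.location" occur; real
// responses may use deeper paths and non-string values.
function assemblePartialArgs(chunks: PartialArg[][]): Record<string, string> {
  const args: Record<string, string> = {};
  for (const chunk of chunks) {
    for (const { jsonPath, stringValue = '' } of chunk) {
      const key = jsonPath.replace(/^\$\./, ''); // "$.location" -> "location"
      args[key] = (args[key] ?? '') + stringValue; // append fragment
    }
  }
  return args;
}

console.log(
  assemblePartialArgs([
    [{ jsonPath: '$.location', stringValue: 'Bos', willContinue: true }],
    [{ jsonPath: '$.location', stringValue: 'ton' }], // final fragment
  ]),
);
// → { location: 'Boston' }
```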

0 commit comments