This example shows how to use cascadeflow as the backend for the Vercel AI SDK UI hook `useChat` in a Next.js App Router project running on the Edge runtime.
```bash
pnpm -C ../../packages/core build
pnpm -C ../../packages/integrations/vercel-ai build
cd examples/vercel-ai-nextjs
pnpm install
pnpm dev
```

`pnpm dev` and `pnpm build` auto-build local workspace packages when they exist (monorepo usage), and skip that step in standalone deployments (for example Vercel project-root deploys).
`next.config.js` also auto-switches module resolution: local monorepo builds when present, published npm packages when deployed standalone.
- `ai` + `@ai-sdk/react` already in your app (this example includes them).
- Any provider keys you want to use (for example `OPENAI_API_KEY`).
```bash
export OPENAI_API_KEY=...
export ANTHROPIC_API_KEY=...
```

- Multi-turn chat (`messages` list) with `useChat`.
- Streaming responses (`data` protocol, plus automatic UI message stream support on newer AI SDKs).
- UI message `parts` payloads are accepted by the backend handler.
- Works with cascadeflow tool loops via `toolExecutor` or `toolHandlers` in `createChatHandler(...)`.
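To make the tool-loop option concrete, here is a hedged sketch of what a `toolHandlers` map might look like. The option names (`protocol`, `toolHandlers`, `maxSteps`) come from this README, but the exact types in `@cascadeflow/vercel-ai` are not shown here, so the shapes below are assumptions for illustration only; the `get_weather` handler is a hypothetical stub.

```typescript
// Hypothetical sketch: the real @cascadeflow/vercel-ai types may differ.
// Option names come from this README; the handler signature is assumed.
type ToolHandler = (args: Record<string, unknown>) => Promise<unknown>;

interface ChatHandlerOptionsSketch {
  protocol: 'data';
  toolHandlers?: Record<string, ToolHandler>;
  maxSteps?: number;
}

// Example: a weather lookup tool the model could call in a tool loop.
const options: ChatHandlerOptionsSketch = {
  protocol: 'data',
  maxSteps: 4,
  toolHandlers: {
    get_weather: async (args) => ({
      city: args.city,
      forecast: 'sunny', // stubbed result; a real handler would call an API
    }),
  },
};

// In the real route you would pass options like these to createChatHandler(...).
export async function runToolDemo(): Promise<unknown> {
  return options.toolHandlers!.get_weather({ city: 'Berlin' });
}
```

In a real route, each handler receives the arguments the model produced for that tool call and returns a JSON-serializable result that is fed back into the loop until `maxSteps` is reached or the model stops calling tools.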
- Trivial text chat: pass plain `messages`.
- Single tool-call planning: provide `tools` and optional `extra.tool_choice`.
- Tool-loop execution: add `toolExecutor` or `toolHandlers` plus `maxSteps`.
- Multi-tool continuation: send assistant/tool message-list turns for closed-loop workflows.
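The first two modes differ mainly in the JSON body sent to `/api/chat`. The sketch below shows plausible payload shapes, assuming an OpenAI-style `tools` schema; only `messages`, `tools`, and `extra.tool_choice` are named in this README, so everything else is an assumption.

```typescript
// Hedged sketch of request bodies for the modes above.
// The OpenAI-style tool schema is an assumption, not confirmed cascadeflow API.
interface ChatMessage {
  role: 'user' | 'assistant' | 'system' | 'tool';
  content: string;
}

// 1. Trivial text chat: plain messages only.
export const textChatBody = {
  messages: [{ role: 'user', content: 'Hello!' }] as ChatMessage[],
};

// 2. Single tool-call planning: tools plus optional extra.tool_choice.
export const toolPlanningBody = {
  messages: [{ role: 'user', content: 'Weather in Berlin?' }] as ChatMessage[],
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    },
  ],
  extra: { tool_choice: 'auto' },
};
```

The tool-loop and continuation modes reuse these shapes: the server adds `toolExecutor`/`toolHandlers` and `maxSteps` on its side, and continuation turns append assistant and tool messages to the same `messages` list.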
```bash
vercel link --yes --project cascadeflow-vercel-ai-nextjs-sandbox
vercel env add OPENAI_API_KEY production
vercel env add OPENAI_API_KEY preview
vercel deploy
```

On some Vercel team plans, newly created projects can default to deployment protection (`ssoProtection`).
If enabled, direct API probes to `/api/chat` can return 401 even when the route is healthy.
For sandbox E2E validation, disable protection on the sandbox project (or allow unauthenticated access for its domain).
After deploy, validate the real network path (not only local tests):
```bash
DEPLOY_URL="https://<your-deployment>.vercel.app"
curl -sS -X POST "$DEPLOY_URL/api/chat" \
  -H "content-type: application/json" \
  --data '{"messages":[{"role":"user","content":"Reply with: cascadeflow-ok"}]}'
```

Expected result: a streaming response payload containing assistant text (for example `cascadeflow-ok`).
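The same probe can be issued from Node instead of curl. The sketch below builds the equivalent `fetch` request; `deployUrl` is a placeholder for your real deployment URL.

```typescript
// Build the fetch equivalent of the curl probe above.
// `deployUrl` is a placeholder; substitute your real deployment URL.
export function buildChatProbe(deployUrl: string) {
  return {
    url: `${deployUrl}/api/chat`,
    init: {
      method: 'POST',
      headers: { 'content-type': 'application/json' } as Record<string, string>,
      body: JSON.stringify({
        messages: [{ role: 'user', content: 'Reply with: cascadeflow-ok' }],
      }),
    },
  };
}

// Usage:
//   const { url, init } = buildChatProbe('https://<your-deployment>.vercel.app');
//   const res = await fetch(url, init);
//   // then read the streaming body and check for the expected assistant text
```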
`app/api/chat/route.ts` uses `@cascadeflow/vercel-ai`'s `createChatHandler(...)` with `protocol: 'data'`.

- On AI SDK v4, this emits the data stream protocol expected by `useChat`.
- On newer AI SDK versions, the handler automatically uses the UI message stream when available.
`app/page.tsx` uses `useChat()` (default route is `/api/chat`).