AI SDK Integration
Record and replay Vercel AI SDK conversations through the Tapes proxy with zero changes to your application code.
This example demonstrates integrating Tapes with the Vercel AI SDK using a custom fetch wrapper. All requests route through the Tapes proxy for recording, search, and replay.
Source code: tapes-ai-sdk-example
How It Works
The integration sits between your AI SDK calls and the LLM provider. Tapes captures every request and response without modifying the SDK's behavior.
Your App → Tapes Proxy (localhost:8080) → LLM Provider → Storage (SQLite or PostgreSQL)
A custom fetch wrapper redirects AI SDK requests through the local Tapes proxy. The proxy forwards them to the provider, records both sides, and returns the response unchanged.
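The core of such a wrapper is a URL rewrite: keep the provider's path and query string, but point the request at the local proxy. A minimal sketch (illustrative only; the example repo's actual implementation may differ in detail):

```javascript
// Rewrite a provider URL so the request hits the local Tapes proxy instead,
// preserving the original path and query string.
function rewriteToProxy(url, proxyUrl = 'http://localhost:8080') {
  const original = new URL(url);
  return new URL(original.pathname + original.search, proxyUrl).toString();
}

// A fetch wrapper built on top of it. Responses pass through unchanged,
// so the SDK behaves exactly as it would talking to the provider directly.
function createTapesFetch({ proxyUrl = 'http://localhost:8080', headers = {} } = {}) {
  return (input, init = {}) =>
    fetch(rewriteToProxy(typeof input === 'string' ? input : input.url, proxyUrl), {
      ...init,
      headers: { ...init.headers, ...headers },
    });
}
```

The proxy sees the same path the provider would (e.g. `/v1/chat/completions`), so it can forward the request upstream after recording it.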
Prerequisites
- Tapes installed
- Node.js 18+
- An OpenAI or Anthropic API key
Setup
Clone the example repo and install dependencies:
git clone https://github.com/papercomputeco/tapes-ai-sdk-example.git
cd tapes-ai-sdk-example
npm install
Set your API key:
export OPENAI_API_KEY="sk-..."
# or
export ANTHROPIC_API_KEY="sk-ant-..."
Start the Tapes proxy in a separate terminal:
tapes serve
For PostgreSQL storage, pass a connection string:
tapes serve --postgres "postgres://user:pass@localhost:5432/tapes"
You can also set TAPES_POSTGRES_DSN in your .env file. See PostgreSQL Storage for full setup details.
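For example, the equivalent .env entry would look like this (the credentials and host are placeholders):

```shell
# .env — used in place of the --postgres flag (placeholder credentials)
TAPES_POSTGRES_DSN=postgres://user:pass@localhost:5432/tapes
```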
Run the example:
npm start
Try It Out
Open http://localhost:3000 in your browser to start chatting. Send a few messages to generate recorded conversations you can search and replay.
Query Recorded Conversations
After chatting, use the Tapes CLI to search and inspect your recorded conversations:
Semantic Search
Find conversations by meaning:
tapes search "pizza recipe"
View History
See recent recorded activity:
tapes log
Restore State
Check out a previous conversation checkpoint:
tapes checkout <hash>
Provider Wrapper
The tapes/ai.js module provides pre-configured models that route through the proxy. Import and use them directly with the AI SDK:
import { model } from './tapes/ai.js';
import { streamText } from 'ai';
const result = streamText({
model,
prompt: 'Hello, world!',
});
for await (const chunk of result.textStream) {
process.stdout.write(chunk);
}
For full control over provider and model selection, use createTapesProvider():
import { createTapesProvider } from './tapes/ai.js';
const { provider, model, storage } = createTapesProvider({
sessionId: 'my-session',
provider: 'anthropic',
model: 'claude-sonnet-4-5-20250929',
debug: true,
});
console.log(`Storage backend: ${storage}`); // "postgres" or "sqlite"
The storage field reflects the active backend. When TAPES_POSTGRES_DSN is set, the wrapper automatically uses PostgreSQL.
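That selection can be sketched as a simple environment check (a hypothetical helper for illustration; the wrapper's internal logic may differ):

```javascript
// Sketch of env-based backend selection (resolveStorageBackend is a
// hypothetical name, not part of the tapes/ai.js API).
function resolveStorageBackend(env = process.env) {
  // A non-empty TAPES_POSTGRES_DSN selects PostgreSQL; otherwise
  // fall back to the default SQLite store.
  return env.TAPES_POSTGRES_DSN ? 'postgres' : 'sqlite';
}
```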
Session Tracking
Track conversations per user or context with session IDs. Tapes groups recorded interactions by session for easier search and replay.
import { createSessionModel } from './tapes/ai.js';
// Each session gets its own conversation history
const model = createSessionModel('user-123-chat');
Sessions use the X-Tapes-Session header under the hood. You can also pass custom headers for additional metadata via tapes-fetch.js:
import { createTapesFetch } from './tapes-fetch.js';
const tapesFetch = createTapesFetch({
proxyUrl: 'http://localhost:8080',
headers: {
'X-Tapes-Session': 'session-id',
'X-Tapes-App': 'my-app',
'X-Tapes-Environment': 'production',
},
});