
future-agi/agent-command-center-sdk

Future AGI — Client SDKs for the Agent Command Center


Python and TypeScript SDKs — plus LangChain, LlamaIndex, React, and Vercel AI SDK integrations — for Agent Command Center, Future AGI's open-source, OpenAI-compatible AI gateway.

Apache 2.0 PyPI npm Python 3.9+ Node 18+ Discord GitHub stars

Quickstart · Gateway repo · Docs · Try Cloud (Free) · Discord · Discussions


What's in here

This repo ships the client SDKs for the Agent Command Center. The gateway itself — the Go service that handles routing, caching, guardrails, cost tracking, and the OpenAI-compatible endpoint — lives in the future-agi/future-agi platform monorepo. These SDKs give you typed Python and TypeScript clients, plus integrations for LangChain, LlamaIndex, React, and the Vercel AI SDK.

If you already know OpenAI's SDK, you already know these. Swap OpenAI for AgentCC, point base_url at the gateway, and every gateway feature — multi-provider routing, semantic caching, inline guardrails, per-key budgets — is available through the same call.
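To make the "one-line change" concrete, here is a stdlib-only sketch of what an OpenAI-compatible client sends on the wire: the request body is identical whether it targets OpenAI directly or the gateway; only the base URL (and API key) differ. The endpoint URL is the one from the quickstart; the payload shape is the standard OpenAI chat-completions format.

```python
def chat_request(base_url: str, model: str, user_message: str) -> dict:
    """Build the OpenAI-compatible chat request an SDK client would send."""
    return {
        "url": f"{base_url}/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": user_message}],
        },
    }

openai_req = chat_request("https://api.openai.com/v1", "gpt-4o", "Hello!")
gateway_req = chat_request("https://gateway.futureagi.com/v1", "gpt-4o", "Hello!")

# Same body, different endpoint -- that is the whole migration.
assert openai_req["body"] == gateway_req["body"]
```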


Agent Command Center SDK — one OpenAI-compatible call, 100+ providers, routing, caching, guardrails

Packages

| Package | Runtime | Install | Purpose |
| --- | --- | --- | --- |
| `agentcc` | Python 3.9+ | `pip install agentcc` | Core Python client — sync + async, streaming, tools, structured output |
| `@agentcc/client` | Node 18+ | `npm install @agentcc/client` | Core TypeScript client — ESM + CJS, fully typed |
| `@agentcc/langchain` | Node 18+ | `npm install @agentcc/langchain` | Drop-in `ChatOpenAI` replacement for LangChain.js |
| `@agentcc/llamaindex` | Node 18+ | `npm install @agentcc/llamaindex` | LLM + embedding integration for LlamaIndex.TS |
| `@agentcc/react` | Node 18+ | `npm install @agentcc/react` | React context + hooks for chat UIs |
| `@agentcc/vercel` | Node 18+ | `npm install @agentcc/vercel` | Vercel AI SDK provider |

Features

  • OpenAI-compatible surface. Chat, completions, embeddings, images, audio, moderations, files, batches, rerank, responses — same shape as OpenAI's SDK. Migrating is a one-line change.
  • 100+ providers through one endpoint. OpenAI, Anthropic, Google, Vertex AI, Bedrock, Azure, Groq, Together, Mistral, Fireworks, Ollama, vLLM — whatever you pick, your SDK call doesn't change.
  • 15 routing strategies, surfaced per request. Fallback chains, shadow traffic, latency-aware routing, cost-optimised selection, circuit breakers — configured via a typed config option or gateway-side virtual keys.
  • Streaming, tool calling, structured output. Iterator patterns in Python, async iterables in TypeScript — both fully typed.
  • Inline guardrails + cost tracking. Every request can carry a guardrail policy and a budget header. The gateway enforces both; the SDK surfaces the results.
  • Framework integrations that don't rewrap. @agentcc/langchain is a genuine BaseChatModel. @agentcc/vercel is a real AI SDK provider. @agentcc/llamaindex implements LlamaIndex's LLM and BaseEmbedding. Use them the way you use the originals.
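As a hedged illustration of the "guardrail policy and a budget header" point: per-request metadata like this typically rides on the HTTP request alongside the standard OpenAI-compatible body. The header names below (`x-agentcc-guardrails`, `x-agentcc-budget-usd`) are hypothetical stand-ins, not the gateway's documented names — see the gateway docs for the real ones.

```python
import json

def with_gateway_metadata(headers: dict, guardrail_policy: str, budget_usd: float) -> dict:
    """Return a copy of the request headers with illustrative gateway
    metadata attached. Header names here are HYPOTHETICAL examples."""
    out = dict(headers)
    out["x-agentcc-guardrails"] = guardrail_policy        # hypothetical header name
    out["x-agentcc-budget-usd"] = json.dumps(budget_usd)  # hypothetical header name
    return out

headers = with_gateway_metadata(
    {"Authorization": "Bearer sk-..."},
    guardrail_policy="pii-block",
    budget_usd=0.50,
)
assert headers["x-agentcc-guardrails"] == "pii-block"
```

The gateway enforces the policy and budget server-side; the SDK's job is only to attach the metadata and surface the results.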

🚀 Quickstart

Get a key at app.futureagi.com (free tier available) or self-host the gateway — then:

Python

```bash
pip install agentcc
```

```python
import os
from agentcc import AgentCC

client = AgentCC(
    api_key=os.environ["AGENTCC_API_KEY"],
    base_url="https://gateway.futureagi.com/v1",
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

TypeScript

```bash
npm install @agentcc/client
```

```typescript
import { AgentCC } from "@agentcc/client";

const client = new AgentCC({
  apiKey: process.env.AGENTCC_API_KEY,
  baseUrl: "https://gateway.futureagi.com/v1",
});

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(response.choices[0].message.content);
```
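Both clients also stream. On the wire, an OpenAI-compatible stream is a sequence of server-sent events — one JSON delta per `data:` line, terminated by `data: [DONE]` — which the SDKs wrap in iterators. A minimal stdlib sketch of that framing:

```python
import json
from typing import Optional

def parse_sse_line(line: str) -> Optional[dict]:
    """Parse one SSE line into a chunk dict; None for [DONE] or non-data lines."""
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):]
    if payload.strip() == "[DONE]":
        return None
    return json.loads(payload)

# Example chunks in the standard OpenAI streaming shape.
raw = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo!"}}]}',
    "data: [DONE]",
]
text = "".join(
    chunk["choices"][0]["delta"].get("content", "")
    for line in raw
    if (chunk := parse_sse_line(line)) is not None
)
assert text == "Hello!"
```

In practice you never parse this yourself — `stream=True` in Python or `for await` in TypeScript yields the chunks already typed.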

Set AGENTCC_API_KEY and AGENTCC_BASE_URL in your environment and both clients pick them up automatically.
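The pickup behavior amounts to a standard resolution rule — explicit constructor arguments win, otherwise the environment variable is read. (The exact precedence inside the SDK is an assumption here; this sketch shows the conventional order.)

```python
import os
from typing import Optional

def resolve_setting(explicit: Optional[str], env_var: str,
                    default: Optional[str] = None) -> Optional[str]:
    """Explicit argument first, then the environment, then a default."""
    return explicit if explicit is not None else os.environ.get(env_var, default)

os.environ["AGENTCC_BASE_URL"] = "https://gateway.futureagi.com/v1"

# Picked up from the environment when no argument is passed...
assert resolve_setting(None, "AGENTCC_BASE_URL") == "https://gateway.futureagi.com/v1"
# ...but an explicit argument always wins.
assert resolve_setting("http://localhost:8080/v1", "AGENTCC_BASE_URL") == "http://localhost:8080/v1"
```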


Gateway features through the SDK

Everything the gateway supports — routing strategies, caching, guardrails, budgets — is available per request via a config option. No separate API to learn.

```typescript
import { AgentCC, type GatewayConfig } from "@agentcc/client";

const config: GatewayConfig = {
  strategy: "fallback",
  targets: [
    { provider: "openai", model: "gpt-4o" },
    { provider: "anthropic", model: "claude-sonnet-4-20250514" },
  ],
};

const client = new AgentCC({
  apiKey: process.env.AGENTCC_API_KEY,
  baseUrl: "https://gateway.futureagi.com/v1",
  config,
});
```
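The same fallback configuration, written as the plain data it serializes to. (That the Python client accepts an equivalent dict-shaped config is an assumption; the field names are taken from the `GatewayConfig` example above.)

```python
# Fallback routing: the first target is primary; each later entry is
# tried only if the one before it fails.
config = {
    "strategy": "fallback",
    "targets": [
        {"provider": "openai", "model": "gpt-4o"},
        {"provider": "anthropic", "model": "claude-sonnet-4-20250514"},
    ],
}

assert config["targets"][0]["provider"] == "openai"
```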

Gateway docs → · Routing strategies → · Guardrails →


Framework integrations

| Package | What you get |
| --- | --- |
| `@agentcc/langchain` | `ChatAgentCC` (drop-in `ChatOpenAI`), `AgentCCEmbeddings`, `AgentCCCallbackHandler` for unified observability across LangChain + gateway |
| `@agentcc/llamaindex` | `AgentCCLLM`, `AgentCCEmbedding` — pass them to any LlamaIndex pipeline that accepts an LLM or embedding model |
| `@agentcc/react` | `AgentCCProvider`, `useAgentCCChat` (streaming), `useAgentCCCompletion`, `useAgentCCObject` (structured output) |
| `@agentcc/vercel` | `createAgentCC()` provider for `generateText` / `streamText` — tools, structured output, and multi-step loops pass through |

Each integration has its own README with full examples.


Related Future AGI repos

These SDKs are one slice of the Future AGI platform — an open-source stack for making AI agents reliable. You can use them standalone against any Agent Command Center deployment, or alongside the rest.

| Repo | What it is |
| --- | --- |
| future-agi/future-agi | Platform monorepo — the gateway itself, evaluations, simulations, tracing, prompt optimization |
| future-agi/traceAI | OpenTelemetry-native instrumentation for 50+ AI frameworks |
| future-agi/ai-evaluation | 50+ evaluation metrics + guardrail scanners |
| future-agi/agent-opt | Six prompt-optimization algorithms (GEPA, PromptWizard, and more) |

Requirements

  • Python 3.9+ for the `agentcc` package
  • Node 18+ for the `@agentcc/*` packages


Documentation


🤝 Contributing

Contributions welcome — bug fixes, new framework integrations, examples, docs improvements, anything.

  1. Browse good first issue
  2. Read the Contributing Guide
  3. Say hi on Discord or Discussions

Security reports: see SECURITY.md.


🌍 Community & support

| Channel | Purpose |
| --- | --- |
| 💬 Discord | Real-time help from the team and community |
| 🗨️ GitHub Discussions | Ideas, questions, roadmap input |
| 🐦 Twitter / X | Release announcements |
| 📝 Blog | Engineering & research posts |
| 📧 support@futureagi.com | Cloud account / billing |
| 🔐 security@futureagi.com | Private vulnerability disclosure (24 h ack — see SECURITY.md) |

⭐ Star history


📄 License

Apache License 2.0 — see LICENSE.

