A unified, batteries-included client for embedding APIs that actually works.
The world of embedding API clients is broken:
- Everyone defaults to OpenAI's client for embeddings, even though it wasn't designed for that purpose
- Provider-specific libraries (VoyageAI, Cohere, etc.) are inconsistent, poorly maintained, or outright broken
- Universal clients like LiteLLM and any-llm-sdk don't focus on embeddings at all; they rely on the native client libraries, inheriting all their problems
- Every provider has different capabilities (some support dimension changes, others don't), with no standardized way to discover what's available
- Most clients lack basic features like retry logic, proper error handling, and usage tracking
- There's no single source of truth for model metadata, pricing, or capabilities
Catsu fixes this. It's a high-performance, unified client built specifically for embeddings with:
- A clean, consistent API across all providers
- Built-in retry logic with exponential backoff
- Automatic usage and cost tracking
- Rich model metadata and capability discovery
- Rust core with Python bindings for maximum performance
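To make the retry behavior concrete: exponential backoff doubles the wait after each failed attempt, up to a cap, so transient provider errors are absorbed without hammering the API. The sketch below is a generic illustration of that delay schedule, not Catsu's actual internals (the function name and parameters are hypothetical):

```rust
/// Hypothetical backoff schedule: delay = base * 2^attempt, capped at max.
/// Catsu's real retry parameters may differ.
fn backoff_delay_ms(attempt: u32, base_ms: u64, max_ms: u64) -> u64 {
    // Clamp the shift so large attempt counts can't overflow the multiplier.
    base_ms.saturating_mul(1u64 << attempt.min(16)).min(max_ms)
}

fn main() {
    // Delays grow 100 ms, 200 ms, 400 ms, ... capped at 5 s.
    for attempt in 0..6 {
        println!("attempt {} -> {} ms", attempt, backoff_delay_ms(attempt, 100, 5_000));
    }
}
```

In practice a small random jitter is usually added to each delay so that many clients retrying at once don't synchronize.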
Add to your `Cargo.toml`:

```toml
[dependencies]
catsu = "0.1"
tokio = { version = "1", features = ["full"] }
```

```rust
use catsu::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create a client (reads API keys from the environment)
    let client = Client::new()?;

    // Generate embeddings
    let response = client.embed(
        "openai:text-embedding-3-small",
        vec!["Hello, world!".to_string(), "How are you?".to_string()],
    ).await?;

    println!("Dimensions: {}", response.dimensions);
    println!("Tokens used: {}", response.usage.tokens);
    println!("Embedding: {:?}", &response.embeddings[0][..5]);

    Ok(())
}
```

```rust
use catsu::{Client, InputType};

let response = client.embed_with_options(
    "openai:text-embedding-3-small",
    vec!["Search query".to_string()],
    Some(InputType::Query), // input type hint
    Some(256),              // output dimensions
).await?;
```

```rust
// List all available models
let models = client.list_models(None);

// Filter by provider
let openai_models = client.list_models(Some("openai"));
for model in openai_models {
    println!("{}: {} dims", model.name, model.dimensions);
}
```

Looking for Python? See the Python documentation.
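Whichever language you use, the embeddings come back as plain vectors of floats, so downstream tasks like semantic search are ordinary vector math. For example, ranking results by cosine similarity (a standalone sketch that does not depend on Catsu's API; the vectors here are made up):

```rust
// Cosine similarity between two embedding vectors:
// dot(a, b) / (|a| * |b|), in [-1, 1] for non-zero inputs.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (norm_a * norm_b)
}

fn main() {
    // Stand-ins for a query embedding and a document embedding.
    let query = [0.1_f32, 0.9, 0.0];
    let doc = [0.2_f32, 0.8, 0.1];
    println!("similarity = {:.3}", cosine_similarity(&query, &doc));
}
```

This is also where the `InputType::Query` hint matters: providers that distinguish query and document embeddings place them in the same space so comparisons like this are meaningful.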
```bash
pip install catsu
```

```python
from catsu import Client

client = Client()
response = client.embed("openai:text-embedding-3-small", ["Hello, world!"])
print(f"Dimensions: {response.dimensions}")
```

Can't find your favorite model or provider? Open an issue and we'll add it!
For guidelines on contributing, see CONTRIBUTING.md.
If you found this helpful, consider giving it a ⭐!
made with ❤️ by chonkie, inc.
