AI Agents

Cainty's AI agent system lets you automate content creation using 6 LLM providers and 24+ models.

How It Works

  1. Create an agent — Choose a provider, model, and system prompt
  2. Execute — Run manually or on a schedule
  3. Review — Generated content enters the queue for review
  4. Publish — Approve, edit, or reject from the content queue
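The four steps above can be sketched as a simple create → execute → review flow. This is a minimal illustration only; the class and function names here are hypothetical, not Cainty's actual API:

```python
# Illustrative sketch of the agent lifecycle: create, execute, review, publish.
# All names are hypothetical; Cainty's real internals may differ.
from dataclasses import dataclass


@dataclass
class Agent:
    provider: str
    model: str
    system_prompt: str


@dataclass
class QueueItem:
    title: str
    body: str
    status: str = "pending"  # pending -> approved | rejected


def execute(agent: Agent, topic: str) -> QueueItem:
    # The real system would call the configured LLM here; we stub generation.
    body = f"[{agent.provider}/{agent.model}] draft about {topic}"
    return QueueItem(title=topic, body=body)


def approve(item: QueueItem) -> QueueItem:
    item.status = "approved"  # approved items are published as posts
    return item


agent = Agent("anthropic", "claude-sonnet-4.5", "You write blog posts.")
item = execute(agent, "AI agents")
approve(item)
```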

Supported Providers

Provider    Example Models                      Config Key
--------    ----------------------------------  -----------------
Anthropic   Claude Sonnet 4.5, Claude Haiku     ANTHROPIC_API_KEY
OpenAI      GPT-4o, GPT-4o-mini                 OPENAI_API_KEY
Google      Gemini 2.0 Flash, Gemini Pro        GOOGLE_API_KEY
DeepSeek    DeepSeek Chat, DeepSeek Reasoner    DEEPSEEK_API_KEY
xAI         Grok 2, Grok 3                      XAI_API_KEY
Ollama      Any local model                     OLLAMA_BASE_URL
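The provider-to-config-key mapping from the table can be expressed as a lookup. The mapping itself comes from the table above; the helper function is an illustrative sketch, not part of Cainty:

```python
import os

# Config keys per provider, as listed in the table above.
PROVIDER_CONFIG_KEYS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "google": "GOOGLE_API_KEY",
    "deepseek": "DEEPSEEK_API_KEY",
    "xai": "XAI_API_KEY",
    "ollama": "OLLAMA_BASE_URL",  # base URL rather than an API key
}


def provider_credential(provider: str):
    # Hypothetical helper: read the relevant value from the environment.
    return os.environ.get(PROVIDER_CONFIG_KEYS[provider])
```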

Setting Up API Keys

You can configure API keys in two ways:

  • .env file — Add keys directly to your configuration
  • Admin UI — Go to Admin > Settings > LLM Keys to manage keys through the interface (stored encrypted)
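If you use the .env route, entries follow the standard KEY=value format. The values below are placeholders, not real credentials:

```
# .env — placeholder values; substitute your own credentials
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
OLLAMA_BASE_URL=http://localhost:11434
```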

Creating an Agent

Navigate to Admin > Agents > New Agent and configure:

  • Name — A descriptive name for the agent
  • Provider — Which LLM service to use
  • Model — Specific model from the provider
  • System Prompt — Instructions that define the agent's behavior
  • User Prompt Template — Template for generating content (supports variables)
  • Target Category — Where generated posts will be categorized
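The fields above can be pictured as a single configuration record. The field names come from this page; the dict structure and the `{{topic}}` substitution helper are illustrative assumptions, not Cainty's actual storage format or templating engine:

```python
# Hypothetical agent definition using the fields listed above.
agent_config = {
    "name": "Weekly Tech Digest",
    "provider": "openai",
    "model": "gpt-4o-mini",
    "system_prompt": "You are a concise technology blogger.",
    # {{topic}} is a template variable filled in at execution time
    "user_prompt_template": "Write a 500-word post about {{topic}}.",
    "target_category": "technology",
}


def render_prompt(template: str, variables: dict) -> str:
    # Simple {{var}} substitution, standing in for whatever
    # templating the real system uses.
    for key, value in variables.items():
        template = template.replace("{{" + key + "}}", value)
    return template


prompt = render_prompt(agent_config["user_prompt_template"], {"topic": "edge AI"})
```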

Content Queue

All AI-generated content goes through a review queue before publishing:

  • Pending — Newly generated, awaiting review
  • Approved — Published as a post
  • Rejected — Discarded

Review content at Admin > Content Queue. You can edit the title, body, and category before approving.
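The three statuses form a small state machine: items start as pending and move to exactly one terminal state. The state names come from the list above; the transition-checking code is an illustrative sketch:

```python
# Sketch of the review-queue state machine described above.
# Status names are from the docs; the enforcement logic is illustrative.
VALID_TRANSITIONS = {
    "pending": {"approved", "rejected"},
    "approved": set(),  # published as a post; terminal
    "rejected": set(),  # discarded; terminal
}


def transition(status: str, new_status: str) -> str:
    if new_status not in VALID_TRANSITIONS[status]:
        raise ValueError(f"cannot move {status} -> {new_status}")
    return new_status
```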

Agent Memory

Agents maintain memory across runs. This helps them:

  • Avoid generating duplicate topics
  • Build on previous content
  • Maintain consistent voice and style
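One of the benefits above, duplicate avoidance, could work roughly like the sketch below: normalize each generated topic and check it against topics remembered from earlier runs. This is a hypothetical illustration, not Cainty's actual memory implementation:

```python
# Illustrative per-agent memory that flags duplicate topics across runs.
class AgentMemory:
    def __init__(self):
        self.seen_topics = set()

    def is_duplicate(self, topic: str) -> bool:
        # Case-insensitive comparison so "AI Agents" matches "ai agents".
        return topic.strip().lower() in self.seen_topics

    def remember(self, topic: str) -> None:
        self.seen_topics.add(topic.strip().lower())


memory = AgentMemory()
memory.remember("AI Agents 101")
```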

Run History

Every agent execution is logged. View run history at Admin > Agents > Runs to see:

  • Execution timestamp
  • Provider and model used
  • Success or failure status
  • Token usage and response time
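A run-history record holding the fields above might look like this. The field list comes from this page; the record type and field names are illustrative assumptions:

```python
# Hypothetical record for one agent execution, covering the logged fields above.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class AgentRun:
    timestamp: datetime       # execution timestamp
    provider: str             # provider used for this run
    model: str                # model used for this run
    success: bool             # success or failure status
    tokens_used: int          # token usage
    response_time_ms: float   # response time


run = AgentRun(
    timestamp=datetime.now(timezone.utc),
    provider="google",
    model="gemini-2.0-flash",
    success=True,
    tokens_used=1200,
    response_time_ms=850.0,
)
```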