Bernstein

Orchestrate any AI coding agent. Any model. One command.

Bernstein in action — parallel AI agents orchestrated in real time


Bernstein takes a goal, breaks it into tasks, assigns them to AI coding agents running in parallel, verifies the output, and merges the results. You come back to working code, passing tests, and a clean git history.

No framework to learn. No vendor lock-in. Agents are interchangeable workers — swap any agent, any model, any provider. The orchestrator itself is deterministic Python code. Zero LLM tokens on scheduling.
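To make the "deterministic Python, zero LLM tokens on scheduling" claim concrete, here is a minimal sketch of what pure-code fan-out/fan-in orchestration looks like. The function names and task shapes are illustrative assumptions, not Bernstein's internal API:

```python
# Illustrative sketch only — not Bernstein's actual code.
from concurrent.futures import ThreadPoolExecutor

def run_agent(task):
    # Stand-in for launching a CLI coding agent in its own workspace;
    # a real orchestrator would shell out to the agent here.
    return f"done: {task}"

def orchestrate(tasks):
    # Scheduling is ordinary Python control flow: which task runs where
    # is decided by code, not by a model response.
    with ThreadPoolExecutor(max_workers=4) as pool:
        # pool.map preserves input order, so results are deterministic
        return list(pool.map(run_agent, tasks))

print(orchestrate(["models", "endpoints", "tests"]))
```

Because every scheduling decision is plain code, the run order and assignments can be audited and replayed exactly.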

Install

Install with any one of the following:

pip install bernstein
pipx install bernstein
uv tool install bernstein

Or via Homebrew:

brew tap chernistry/tap
brew install bernstein

Then run:

bernstein -g "Add JWT auth with refresh tokens, tests, and API docs"

Why Bernstein?

  • Deterministic scheduling


    Pure Python orchestration — zero LLM tokens on coordination. Every decision is auditable code, not a model response.

  • Any agent, any model


    42 CLI adapters: Claude Code, Codex, OpenAI Agents SDK v2, Gemini, Cursor, Aider, Cloudflare Agents, GitHub Copilot, Droid, Crush, and more. Mix cheap local models with cloud models in the same run.

  • Git worktree isolation


    Each agent works in its own git worktree. No merge conflicts. Clean history. Parallel by default.

  • Built-in verification


    Janitor system checks tests, lint, types, and PII before any agent output lands in your codebase.
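The worktree-isolation idea above can be demonstrated with plain git: each agent gets its own checkout on its own branch, so parallel edits never collide. This is a self-contained sketch of the mechanism (the branch naming and directory layout are assumptions, not Bernstein's exact conventions):

```python
# Demonstrates git worktree isolation — the mechanism, not Bernstein's code.
import os
import subprocess
import tempfile

def sh(*args, cwd):
    # Run a git command, raising on failure
    subprocess.run(args, cwd=cwd, check=True, capture_output=True)

root = tempfile.mkdtemp()
repo = os.path.join(root, "repo")
os.makedirs(repo)
sh("git", "init", "-q", cwd=repo)
sh("git", "-c", "user.name=demo", "-c", "user.email=demo@example.com",
   "commit", "-q", "--allow-empty", "-m", "init", cwd=repo)

# One worktree per agent: an isolated checkout on an isolated branch,
# all sharing the same underlying object store.
for agent in ("agent-1", "agent-2"):
    sh("git", "worktree", "add", "-q", "-b", f"task/{agent}",
       os.path.join(root, agent), cwd=repo)

print(sorted(os.listdir(root)))  # ['agent-1', 'agent-2', 'repo']
```

Each agent commits on its own branch, so merging back is an ordinary git merge with full history rather than a conflict-prone shared checkout.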

Install: Get Bernstein installed and verify it runs
First run: Take Bernstein from "installed" to "first orchestrated task complete"
Configuration: bernstein.yaml reference
Adapter Guide: Supported agents and how to add your own
API Reference: Task server REST API
Architecture: How Bernstein works under the hood
Lifecycle FSM: Task and agent state machines with transition tables
What's New: Summary of recent releases (1.8 → 1.9)
Changelog: Full release history

Created by Alex Chernysh (@chernistry).