
Installation

Unfault is an open-source CLI. No account required. Download the binary and run it against your code.

```shell
mkdir -p ~/.local/bin
curl -L -o ~/.local/bin/unfault https://github.com/unfault/cli/releases/latest/download/unfault-aarch64-apple-darwin
chmod +x ~/.local/bin/unfault
```

The download URL points at the latest GitHub release; the example above fetches the Apple Silicon macOS binary (aarch64-apple-darwin), so pick the release asset that matches your platform. Put the binary somewhere on your PATH (e.g. ~/.local/bin on Linux/macOS).
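If ~/.local/bin is not already on your PATH, a one-line export covers it for the current shell (bash/zsh syntax; persist it in your shell profile):

```shell
# Prepend ~/.local/bin so the unfault binary is found; add this line
# to ~/.bashrc or ~/.zshrc to make it permanent
export PATH="$HOME/.local/bin:$PATH"
```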

Verify it works:

```shell
unfault --version
```

Some features (like AI-powered review summaries) require a configured LLM provider. This is optional. The core review and graph commands work without it.

Supported providers: OpenAI, Anthropic, Ollama, any OpenAI-compatible endpoint.

```shell
# OpenAI
unfault config llm openai --model gpt-4o
# Anthropic
unfault config llm anthropic --model claude-3-5-sonnet-latest
# Local Ollama
unfault config llm ollama --model llama3.2
```
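The hosted providers also need an API key. This page doesn't show where the CLI reads it from, but OpenAI and Anthropic tooling conventionally uses these environment variables — treat the names as an assumption and check the Configuration reference:

```shell
# Conventional key variables (assumed, not confirmed by this page)
export OPENAI_API_KEY="<your-openai-key>"        # for the openai provider
export ANTHROPIC_API_KEY="<your-anthropic-key>"  # for the anthropic provider
# Ollama runs locally and needs no API key
```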

See Configuration for the full reference.

Optional: Configure Observability Integrations


To enrich reviews with SLO data from your cloud platform, set the relevant credentials before running unfault review:

| Provider | Required Variables |
| --- | --- |
| GCP Cloud Monitoring | Application Default Credentials (`gcloud auth application-default login`) |
| Datadog | `DD_API_KEY`, `DD_APP_KEY` |
| Dynatrace | `DT_API_TOKEN`, `DT_ENVIRONMENT_URL` |

Run unfault config integrations show to check what’s detected, or unfault config integrations verify to test connectivity.
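For the token-based providers, setting the credentials is a plain shell export before running the review. A Datadog example with placeholder values:

```shell
# Placeholder values; substitute your real Datadog keys before running
# unfault review
export DD_API_KEY="<your-datadog-api-key>"
export DD_APP_KEY="<your-datadog-app-key>"
# Dynatrace works the same way with DT_API_TOKEN and DT_ENVIRONMENT_URL
```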

Quick Start

Run your first review and understand the output. Get started

How It Works

Understand what happens when you run a review. Learn more