# Welcome to Mirascope
The complete toolkit for building LLM-powered applications.
- **Provider-agnostic**: One API for OpenAI, Anthropic, Google, and more
- **Observable by default**: Tracing, versioning, and analytics built-in
- **Production-ready**: Tools, structured outputs, and reliable error handling
## Get Started
### 1. Install Mirascope
<TabbedSection>
<Tab value="uv">
```bash
uv add "mirascope[all]"
```
</Tab>
<Tab value="pip">
```bash
pip install "mirascope[all]"
```
</Tab>
</TabbedSection>
### 2. Set your provider API key
```bash
export OPENAI_API_KEY="your-api-key"
```
<Info>
See [Providers](/docs/learn/llm/providers) for how to configure other providers (Anthropic, Google, etc.).
</Info>
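To use a different provider, export that provider's key instead. As an illustration (the exact variable names Mirascope reads are documented on the Providers page), Anthropic's conventional variable is:

```bash
# Hypothetical example for Anthropic; see the Providers docs for exact names
export ANTHROPIC_API_KEY="your-api-key"
```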
## Your First Agent
Here's a complete agent that uses tools, tracing, and versioning:
```python
from typing import Literal

from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

from mirascope import llm, ops

# Set up OpenTelemetry tracing with a console exporter
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
ops.configure(tracer_provider=provider)
ops.instrument_llm()


@llm.tool
@ops.trace
def calculate(
    operation: Literal["add", "subtract", "multiply", "divide"],
    a: float,
    b: float,
) -> str:
    """Perform a mathematical operation on two numbers."""
    match operation:
        case "add":
            return str(a + b)
        case "subtract":
            return str(a - b)
        case "multiply":
            return str(a * b)
        case "divide":
            return str(a / b) if b != 0 else "Cannot divide by zero"


@ops.version  # Automatically versions `math_agent` and traces its execution
@llm.call("openai/gpt-4o-mini", tools=[calculate])
def math_agent(query: str) -> str:
    return f"Help the user with: {query}"


@ops.trace
def run_math_agent(query: str) -> str:
    response = math_agent(query)
    while response.tool_calls:
        tool_outputs = response.execute_tools()
        response = response.resume(tool_outputs)
    return response.text()


print(run_math_agent("What's 42 * 17?"))
```
This example shows the core Mirascope patterns:
- **Tools** let the LLM call your functions (traced with `@ops.trace`)
- **Versioning** with `@ops.version` tracks changes to your prompts and traces automatically
- **Tracing** on `run_math_agent` captures the entire agent loop as a nested trace
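The agent loop follows a general pattern: keep resuming while the model still requests tools. A minimal sketch of that control flow with a stand-in response object (`FakeResponse` and `run_agent_loop` are hypothetical, not part of Mirascope) shows why the loop terminates:

```python
from dataclasses import dataclass


@dataclass
class FakeResponse:
    """Stand-in for an LLM response: pending tool calls plus final text."""

    tool_calls: list[str]
    final_text: str = "714"

    def execute_tools(self) -> list[str]:
        # Pretend each requested tool ran and produced a result.
        return [f"{call} -> done" for call in self.tool_calls]

    def resume(self, tool_outputs: list[str]) -> "FakeResponse":
        # After tool results are fed back, the model answers with no more calls.
        return FakeResponse(tool_calls=[], final_text=self.final_text)

    def text(self) -> str:
        return self.final_text


def run_agent_loop(response: FakeResponse) -> str:
    # Same shape as run_math_agent: resume until no tool calls remain.
    while response.tool_calls:
        tool_outputs = response.execute_tools()
        response = response.resume(tool_outputs)
    return response.text()


print(run_agent_loop(FakeResponse(tool_calls=["calculate(42, 17)"])))  # prints 714
```

Each `resume` hands the tool outputs back to the model, and the loop exits as soon as a response arrives with no pending tool calls.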
## What's Next
| Learning Path | Topics |
| --- | --- |
| [LLM Quickstart](/docs/quickstart) | Messages, Calls, Tools, Structured Output, Streaming, Agents |
| [Ops Overview](/docs/learn/ops) | Configuration, Tracing, Sessions, Versioning |
| [API Reference](/docs/api) | Full API documentation |