[!CAUTION] This gem is under active development. APIs and features may change without notice. See the CHANGELOG for details.

RobotLab
"Build robots. Solve problems."
RobotLab is a Ruby gem that enables you to build sophisticated AI applications using multiple specialized robots (LLM agents) that work together to accomplish complex tasks.

Each robot is backed by a persistent LLM chat, configured with keyword arguments, and run with a simple positional message. Robots can be orchestrated through networks with task-based pipelines, share information through a reactive memory system, and connect to external tools via the Model Context Protocol (MCP).

Key Features

  • Multi-Robot Architecture

    Build applications with multiple specialized Robots (AI agents), each with persistent chat and memory.

    Learn more

  • Network Orchestration

    Connect robots in task-based pipelines using SimpleFlow with sequential, parallel, and conditional execution.

    Creating Networks

  • Prompt Templates

    Self-contained .md files with YAML front matter that define a complete robot: prompt, tools, MCP, model, and skills.

    Building Robots

  • Composable Skills

    Mix reusable prompt behaviors into any robot. Skills expand depth-first with automatic cycle detection and config cascading.

    Skills Guide

  • Extensible Tools

    Give robots custom capabilities via RobotLab::Tool subclasses with graceful error handling; errors are returned to the LLM as plain text.

    Using Tools

  • Human-in-the-Loop

    The AskUser tool lets robots ask users questions interactively, with open-ended text, multiple choice, and default values.

    Using Tools

  • Content Streaming

    Stream LLM responses in real time via stored on_content: callbacks, per-call blocks, or both together.

    Streaming Guide

  • MCP Integration

    Connect to Model Context Protocol servers to extend robot capabilities with external tools.

    MCP Guide

  • Reactive Memory

    Robots share data through a reactive key-value memory system with subscriptions, blocking reads, and an optional Redis backend.

    Memory System

  • Message Bus

    Bidirectional, cyclic communication between robots via TypedBus for negotiation loops and convergence patterns.

    Message Bus

  • Dynamic Spawning

    Robots create new specialist robots at runtime using spawn; the bus is created lazily, with no upfront wiring required.

    Examples

  • Layered Configuration

    Cascading config from YAML files, environment variables, and RunConfig objects that flows through the network-robot hierarchy.

    Configuration

  • Rails Integration

    Generators, background jobs, and Turbo Stream token broadcasting for real-time streaming to the browser.

    Rails Guide
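The "errors returned to the LLM as plain text" contract from the Extensible Tools card can be sketched without the gem. WeatherTool below is a hypothetical stand-in, not a real RobotLab::Tool subclass; it only illustrates the pattern of converting exceptions into text the LLM can read:

```ruby
# Gem-free sketch of the "errors become plain text" tool contract.
# WeatherTool is hypothetical, not part of RobotLab's API.
class WeatherTool
  FORECASTS = { "Paris" => "Sunny, 21°C" }.freeze

  # Returns a plain-text result either way, so the LLM always receives
  # a readable outcome instead of the run crashing on an exception.
  def call(city:)
    FORECASTS.fetch(city)
  rescue KeyError
    "Error: no forecast for #{city}"
  end
end

tool = WeatherTool.new
puts tool.call(city: "Paris")    # => Sunny, 21°C
puts tool.call(city: "Atlantis") # => Error: no forecast for Atlantis
```

The design point is that the rescue clause is the tool's contract with the model: a failed call is just another observation for the LLM to reason about.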

Quick Example

require "robot_lab"

# Configuration is automatic via environment variables, YAML files, or defaults.
# Set API keys via env vars:
#   ROBOT_LAB_RUBY_LLM__ANTHROPIC_API_KEY=sk-ant-...
#
# Or place a config file at ~/.config/robot_lab/config.yml
# Access config values: RobotLab.config.ruby_llm.model  #=> "claude-sonnet-4"

# Create a robot with keyword arguments
robot = RobotLab.build(
  name: "assistant",
  system_prompt: "You are a helpful assistant. Answer questions clearly and concisely.",
  model: "claude-sonnet-4"
)

# Run the robot with a positional string argument
result = robot.run("What is the capital of France?")

puts result.last_text_content
# => "The capital of France is Paris."

# Memory persists across runs
robot.run("Remember that my favorite color is blue.")
result = robot.run("What is my favorite color?")
puts result.last_text_content
# => "Your favorite color is blue."

# Chaining configuration
robot.with_instructions("Be extra concise.").with_temperature(0.3).run("Explain Ruby in one sentence.")
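The Reactive Memory feature mentioned above can be illustrated with a gem-free sketch. MiniMemory is hypothetical and only demonstrates the subscribe/notify shape of a reactive key-value store; RobotLab's actual memory API (including blocking reads and the Redis backend) may differ:

```ruby
# Hypothetical MiniMemory: a tiny subscribe/notify key-value store
# illustrating the reactive pattern, not RobotLab's real memory class.
class MiniMemory
  def initialize
    @data = {}
    @subscribers = Hash.new { |h, k| h[k] = [] }
  end

  # Register a callback to run whenever `key` is written.
  def subscribe(key, &block)
    @subscribers[key] << block
  end

  # Write a value, then notify every subscriber for that key.
  def write(key, value)
    @data[key] = value
    @subscribers[key].each { |cb| cb.call(value) }
  end

  def read(key)
    @data[key]
  end
end

memory = MiniMemory.new
memory.subscribe(:status) { |v| puts "status changed: #{v}" }
memory.write(:status, "ready")  # prints "status changed: ready"
memory.read(:status)            # => "ready"
```

In a multi-robot setting, one robot writing a key would immediately wake any robots subscribed to it, which is what makes pipelines reactive rather than polling-based.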

Supported LLM Providers

RobotLab supports multiple LLM providers through the ruby_llm library:

| Provider         | Models                                            |
|------------------|---------------------------------------------------|
| Anthropic        | Claude Opus 4, Claude Sonnet 4, Claude Haiku 3.5  |
| OpenAI           | GPT-4o, GPT-4, o1, o3                             |
| Google           | Gemini 2.5 Pro, Gemini 2.5 Flash                  |
| DeepSeek         | DeepSeek V3, DeepSeek R1                          |
| AWS Bedrock      | Claude models via AWS Bedrock                     |
| Google Vertex AI | Gemini models via Vertex AI                       |
| Ollama           | Local models via Ollama                           |
| OpenRouter       | Multi-provider routing                            |
| Mistral          | Mistral Large, Mistral Medium                     |
| xAI              | Grok models                                       |

Installation

Add RobotLab to your Gemfile:

gem "robot_lab"

Or install directly:

gem install robot_lab

Full Installation Guide

Next Steps

License

RobotLab is released under the MIT License.