Basic Example

This example demonstrates how to run the Google Calendar Agent with the Inference Gateway using Docker Compose. The setup includes both services configured to work together, providing a complete AI-powered calendar management solution.

Architecture

Access with A2A Debugger or CLI

┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│                 │    │                 │    │                 │
│ A2A Debugger    │───▶│ Calendar Agent  │───▶│ Google Calendar │
│ CLI Tool        │    │ (A2A Server)    │    │ API             │
│                 │    │ (Port 8080)     │    │                 │
└─────────────────┘    └─────────────────┘    └─────────────────┘
        │                       │
        │                       │
        ▼                       ▼
┌─────────────────┐    ┌─────────────────┐
│                 │    │                 │
│ Task Management │    │ LLM Engine      │
│ • List tasks    │    │ (Optional)      │
│ • View history  │    │ For AI features │
│ • Submit tasks  │    │                 │
└─────────────────┘    └─────────────────┘

Direct Access Flow:

  1. Direct Connection: A2A Debugger connects directly to the Calendar Agent's A2A server endpoint
  2. Submit Tasks: Send JSON-RPC 2.0 requests directly to the agent
  3. Task Management: List, monitor, and retrieve task details and conversation history
  4. Real-time Debugging: View agent responses, tool calls, and execution flow
  5. Agent Tools: Direct access to calendar operations without gateway overhead
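
Under the hood, each submitted task travels as a JSON-RPC 2.0 request. As a rough sketch only (the method name and parameter shape below are assumptions about the A2A schema, not confirmed by this example; the debugger handles the real encoding for you), the payload might resemble:

```shell
# Illustrative payload only: "message/send" and the params shape are
# assumptions about the A2A schema, not taken from this example's code.
cat <<'EOF' > /tmp/a2a-request.json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "message/send",
  "params": {
    "message": {
      "role": "user",
      "parts": [{ "kind": "text", "text": "List my events for today" }]
    }
  }
}
EOF

# Sanity-check that the payload is valid JSON before sending it anywhere
python3 -m json.tool /tmp/a2a-request.json > /dev/null && echo "payload ok"
```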

A2A Debugger Commands:

# Test connection and get agent capabilities
docker compose run --rm a2a-debugger connect
docker compose run --rm a2a-debugger agent-card

# Submit tasks directly to the agent
docker compose run --rm a2a-debugger tasks submit "List my events for today"
docker compose run --rm a2a-debugger tasks submit "Create a meeting tomorrow at 2 PM"
docker compose run --rm a2a-debugger tasks submit-streaming "List my events for today?"

# Monitor and debug
docker compose run --rm a2a-debugger tasks list
docker compose run --rm a2a-debugger tasks get <task-id>
docker compose run --rm a2a-debugger tasks history <context-id>

Benefits of Direct Access:

  • Faster Response: No gateway overhead for debugging
  • Direct Tool Access: Immediate access to calendar operations
  • Enhanced Debugging: Full visibility into agent internals
  • Task Monitoring: Real-time task status and conversation history
  • Development Workflow: Perfect for agent development and testing

Features

  • Google Calendar Agent: Manages calendar events with natural language processing
  • Inference Gateway: High-performance LLM gateway supporting multiple providers
  • Multi-Provider Support: OpenAI, Groq, Anthropic, DeepSeek, Cohere, Cloudflare
  • Mock Mode: Run without Google Calendar integration for testing
  • Health Checks: Built-in health monitoring for both services
  • Automatic Restart: Services restart automatically on failure

Prerequisites

  • Docker and Docker Compose installed
  • Google Calendar API credentials (unless running in mock mode)
  • API keys for at least one LLM provider

Quick Start

1. Clone and Setup

# Navigate to the example directory
cd example

# Copy the environment template
cp .env.example .env

2. Configure Environment Variables

Edit the .env file and configure the required settings:

For Demo Mode (No Google Calendar Integration)

# Set mock mode to true
GOOGLE_MOCK_MODE=true

# Configure at least one LLM provider
GROQ_API_KEY=your_groq_api_key_here
A2A_AGENT_CLIENT_PROVIDER=groq
A2A_AGENT_CLIENT_MODEL=deepseek-r1-distill-llama-70b

For Production Mode (With Google Calendar)

# Disable mock mode
GOOGLE_MOCK_MODE=false

# Configure Google Calendar
GOOGLE_CALENDAR_ID=primary
GOOGLE_SERVICE_ACCOUNT_JSON={"type":"service_account","project_id":"..."}

# Configure LLM provider
GROQ_API_KEY=your_groq_api_key_here
A2A_AGENT_CLIENT_PROVIDER=groq
A2A_AGENT_CLIENT_MODEL=deepseek-r1-distill-llama-70b

3. Start the Services

# Using Task (recommended)
task up

# Or using Docker Compose directly
docker compose up -d

# View logs
task logs
# or
docker compose logs -f

# Check service status
task status
# or
docker compose ps

4. Test the Setup

Check Health Status

# Test Inference Gateway
curl http://localhost:8080/health

Get Agent Information

Set A2A_EXPOSE=true on the Inference Gateway, bring the containers up, and then list the registered agents:

curl http://localhost:8080/v1/a2a/agents

Test Chat Completions

# Test through Inference Gateway (non-streaming)
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "List my events for today"
      }
    ]
  }'

# Test through Inference Gateway (streaming)
curl -N -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "List my events for today"
      }
    ],
    "stream": true
  }'
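
Responses come back in the OpenAI-compatible chat completions shape, so the assistant's reply sits at choices[0].message.content. A minimal sketch for pulling out just that text (using a canned sample response here so it runs without a live gateway; a real response carries extra fields such as id and usage):

```shell
# Canned sample in the OpenAI-compatible response shape; stand-in only,
# so this snippet runs without the gateway being up.
cat <<'EOF' > /tmp/completion.json
{
  "choices": [
    { "message": { "role": "assistant", "content": "You have 2 events today." } }
  ]
}
EOF

# Extract just the assistant text from the response
python3 -c 'import json; print(json.load(open("/tmp/completion.json"))["choices"][0]["message"]["content"])'
```

Piping a real curl response into the same one-liner gives you the plain reply without the surrounding JSON envelope.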

Test with Inference Gateway CLI (Recommended for Convenience)

For the easiest interaction experience, use the inference-gateway CLI, which provides an interactive chat interface:

# Start an interactive chat session with the agent (most convenient)
docker compose run --rm cli

# Alternative: Run a one-off command
docker compose run --rm cli agent "What events do I have today?"

# Alternative: Start in interactive chat mode
docker compose run --rm cli chat

CLI Benefits:

  • Interactive Experience: Natural conversation flow instead of curl commands
  • Automatic Formatting: Responses are properly formatted and easy to read
  • Session Management: Maintains conversation context across multiple queries
  • Real-time Streaming: See responses as they're generated
  • Command History: Use arrow keys to navigate previous commands
  • Error Handling: Clear error messages and retry options

The CLI is perfect for:

  • Testing agent functionality interactively
  • Debugging agent responses in real-time
  • Validating that the agent integration is working correctly
  • Daily usage without technical complexity

Available Tasks

This example includes a Taskfile for easy management. Here are the available commands:

# Service management
task up                 # Start all services
task down               # Stop all services
task restart            # Restart all services
task status             # Show service status

# Monitoring
task logs               # Show logs for all services
task logs-gateway       # Show logs for inference gateway only
task logs-agent         # Show logs for calendar agent only
task health             # Check health of all services

# Testing
task test-gateway       # Test Inference Gateway
task test-agent         # Test Calendar Agent directly
task agent-info         # Get agent information

# Maintenance
task clean              # Stop services and remove volumes
task clean-all          # Stop services and remove everything
task pull               # Pull latest images

# Modes
task demo               # Start in demo mode
task prod               # Start in production mode
task debug              # Start with debug logging

# Validation
task validate-env       # Check environment configuration

Configuration Options

Google Calendar Configuration

Environment Variable             Description                                  Default  Required
GOOGLE_MOCK_MODE                 Run without Google Calendar integration      false    No
GOOGLE_APPLICATION_CREDENTIALS   Path to credentials file                     -        Yes*
GOOGLE_SERVICE_ACCOUNT_JSON      Service account JSON content (single line)   -        Yes*
GOOGLE_CALENDAR_ID               Target calendar ID                           primary  No
GOOGLE_CALENDAR_TIMEZONE         Default timezone                             UTC      No

* Required unless GOOGLE_MOCK_MODE=true
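
The footnote amounts to a simple rule that can be checked before startup. A minimal sketch of that check (values hard-coded here for illustration; a real script would read them from the environment or the .env file):

```shell
# Mock-mode rule from the table above: credentials are only required
# when GOOGLE_MOCK_MODE is not "true". Values hard-coded for illustration.
GOOGLE_MOCK_MODE=true
GOOGLE_APPLICATION_CREDENTIALS=""

if [ "$GOOGLE_MOCK_MODE" = "true" ] || [ -n "$GOOGLE_APPLICATION_CREDENTIALS" ]; then
  echo "config ok"
else
  echo "config error: set GOOGLE_APPLICATION_CREDENTIALS or enable mock mode" >&2
  exit 1
fi
```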

Supported LLM Providers

Groq (Recommended for Speed)

GROQ_API_KEY=your_groq_api_key
A2A_AGENT_CLIENT_PROVIDER=groq
A2A_AGENT_CLIENT_MODEL=deepseek-r1-distill-llama-70b

OpenAI

OPENAI_API_KEY=your_openai_api_key
A2A_AGENT_CLIENT_PROVIDER=openai
A2A_AGENT_CLIENT_MODEL=gpt-4o

Anthropic

ANTHROPIC_API_KEY=your_anthropic_api_key
A2A_AGENT_CLIENT_PROVIDER=anthropic
A2A_AGENT_CLIENT_MODEL=claude-3-opus-20240229

DeepSeek (Cost-Effective)

DEEPSEEK_API_KEY=your_deepseek_api_key
A2A_AGENT_CLIENT_PROVIDER=deepseek
A2A_AGENT_CLIENT_MODEL=deepseek-chat

Cohere

COHERE_API_KEY=your_cohere_api_key
A2A_AGENT_CLIENT_PROVIDER=cohere
A2A_AGENT_CLIENT_MODEL=command-r-plus

Cloudflare Workers AI

CLOUDFLARE_API_TOKEN=your_cloudflare_token
CLOUDFLARE_ACCOUNT_ID=your_account_id
A2A_AGENT_CLIENT_PROVIDER=cloudflare
A2A_AGENT_CLIENT_MODEL=@cf/meta/llama-3.1-8b-instruct

Google Calendar Setup

Option 1: Service Account (Recommended)

  1. Go to Google Cloud Console
  2. Create a new project or select an existing one
  3. Enable the Google Calendar API
  4. Create a Service Account
  5. Download the JSON credentials file
  6. Share your calendar with the service account email
  7. Set GOOGLE_SERVICE_ACCOUNT_JSON to the JSON content (single line)
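
Step 7 needs the downloaded JSON collapsed to one line so it fits a single .env entry. One way to do that (a stand-in credentials file is created here so the snippet is self-contained; point the command at your real download instead):

```shell
# Stand-in credentials file for illustration; substitute your downloaded file.
cat <<'EOF' > /tmp/sa-credentials.json
{
  "type": "service_account",
  "project_id": "my-project"
}
EOF

# Collapse the JSON to a single line suitable for a one-line .env entry
python3 -c 'import json; print(json.dumps(json.load(open("/tmp/sa-credentials.json"))))' \
  > /tmp/sa-oneline.json

echo "GOOGLE_SERVICE_ACCOUNT_JSON=$(cat /tmp/sa-oneline.json)"
```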

API Usage Examples

List Calendar Events

# Through Inference Gateway (non-streaming)
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "What events do I have this week?"
      }
    ]
  }'

# Through Inference Gateway (streaming - real-time response)
curl -N -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "What events do I have this week?"
      }
    ],
    "stream": true
  }'

Create Calendar Event

# Through Inference Gateway (non-streaming)
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "Schedule a team meeting tomorrow at 2 PM for 1 hour"
      }
    ]
  }'

# Through Inference Gateway (streaming - real-time response)
curl -N -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "Schedule a team meeting tomorrow at 2 PM for 1 hour"
      }
    ],
    "stream": true
  }'

Update Calendar Event

# Through Inference Gateway (non-streaming)
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "Move my 2 PM meeting to 3 PM"
      }
    ]
  }'

# Through Inference Gateway (streaming - real-time response)
curl -N -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "Move my 2 PM meeting to 3 PM"
      }
    ],
    "stream": true
  }'

Delete Calendar Event

# Through Inference Gateway (non-streaming)
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "Cancel my meeting with John tomorrow"
      }
    ]
  }'

# Through Inference Gateway (streaming - real-time response)
curl -N -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "Cancel my meeting with John tomorrow"
      }
    ],
    "stream": true
  }'

Troubleshooting

Common Issues

Services Won't Start

# Check logs for errors
docker compose logs

# Restart services
docker compose down
docker compose up -d

Health Checks Failing

# Check service status
docker compose ps

# Test connectivity
curl http://localhost:8080/health

Google Calendar Authentication Issues

  • Verify credentials are correctly formatted
  • Ensure calendar is shared with service account
  • Check API quotas in Google Cloud Console

LLM Provider Issues

  • Verify API keys are correct
  • Check provider-specific rate limits
  • Try different models if current one fails

Debug Mode

Enable debug logging for more detailed output:

# In .env file
LOG_LEVEL=debug
SERVER_GIN_MODE=debug

Viewing Logs

# All services
docker compose logs -f

# Specific service
docker compose logs -f google-calendar-agent
docker compose logs -f inference-gateway

# Last 100 lines
docker compose logs --tail=100

Cleanup

# Stop services
docker compose down

# Remove volumes and networks
docker compose down -v

# Remove images (optional)
docker compose down --rmi all

Security Considerations

  • Store API keys securely (use Docker secrets in production)
  • Use HTTPS in production environments
  • Regularly rotate API keys
  • Limit Google Calendar permissions to necessary scopes
  • Monitor API usage and set up alerts

Production Deployment

For production deployments, consider:

  • Using Docker secrets for sensitive data
  • Setting up reverse proxy with SSL termination
  • Implementing proper monitoring and logging
  • Using managed databases for persistence
  • Setting up automated backups
  • Implementing health check endpoints

Support

For issues and questions: