
[Feature]: add anthropic-vertex provider for Claude models via Google Vertex AI #17277

@sallyom

Description

Summary

Analyzed by Claude Opus 4.6 noreply@anthropic.com

Add a new anthropic-vertex provider that enables accessing Anthropic Claude models through Google Vertex AI, using GCP service account authentication instead of a direct Anthropic API key. This allows organizations with GCP Vertex AI access to use Claude models billed through their GCP project.

Problem to solve

Many enterprise environments access Anthropic models through Google Vertex AI rather than directly through the Anthropic API. This is common when:

  • The organization has a GCP agreement that includes Vertex AI
  • Billing needs to flow through GCP
  • Direct Anthropic API keys are not available, but GCP service accounts are

The amazon-bedrock provider already demonstrates this pattern (Claude through a cloud provider). This feature adds the equivalent for Google Cloud.

Proposed solution

  1. Add @anthropic-ai/vertex-sdk dependency - The official Anthropic Vertex SDK for Node.js provides an AnthropicVertex client that uses the same Messages API format but authenticates via GCP credentials.

  2. Create the provider implementation (either a new provider file or an extension of the existing anthropic provider) that:

  • Uses AnthropicVertex client from @anthropic-ai/vertex-sdk
  • Authenticates via GCP Application Default Credentials (GOOGLE_APPLICATION_CREDENTIALS)
  • Requires GOOGLE_CLOUD_PROJECT (or ANTHROPIC_VERTEX_PROJECT_ID) and GOOGLE_CLOUD_LOCATION env vars
  • Handles Vertex-specific model ID formats (e.g., version suffixes like claude-sonnet-4-5@20250929)
  3. Add auth resolution: In src/agents/model-auth.ts, add an "anthropic-vertex" case in resolveEnvApiKey() that checks for GCP ADC credentials, similar to the existing google-vertex (Gemini) case.

  4. Register Claude models under the new provider: Add the Claude models available on Vertex AI (claude-sonnet-4-5, claude-opus-4-5, claude-haiku-3-5, etc.) to the model catalog under the anthropic-vertex provider.

  5. Add to provider config types: Ensure "anthropic-vertex" is recognized as a valid KnownProvider and its API type is registered.
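The Vertex-specific model-ID handling in step 2 could be sketched roughly as follows. The helper name and the version table are hypothetical (only the claude-sonnet-4-5@20250929 suffix is taken from this issue); the actual mapping would live in the model catalog:

```typescript
// Hypothetical sketch: Vertex AI addresses Claude models as
// "<model>@<version-date>", while the catalog uses bare IDs like
// "claude-sonnet-4-5". Only the claude-sonnet-4-5 entry below comes
// from this issue; other entries would be filled in from the catalog.
const VERTEX_MODEL_VERSIONS: Record<string, string> = {
  "claude-sonnet-4-5": "claude-sonnet-4-5@20250929",
};

function toVertexModelId(catalogId: string): string {
  // IDs that already carry a version suffix pass through unchanged.
  if (catalogId.includes("@")) return catalogId;
  const mapped = VERTEX_MODEL_VERSIONS[catalogId];
  if (!mapped) {
    throw new Error(`No Vertex AI version mapping for model "${catalogId}"`);
  }
  return mapped;
}
```

The provider would then pass the resulting ID to the AnthropicVertex client's Messages API call in place of the bare catalog ID.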

Environment Variables

  • GOOGLE_APPLICATION_CREDENTIALS: Path to a GCP service account JSON key file
  • GOOGLE_CLOUD_PROJECT: GCP project ID with Vertex AI enabled
  • GOOGLE_CLOUD_LOCATION: GCP region (e.g., us-central1, global)
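The auth-resolution step could resolve these variables along these lines. Names are illustrative, not the actual model-auth.ts API; note that ADC may also come from gcloud user credentials, so the key-file path is treated as optional:

```typescript
// Hypothetical sketch of resolving Vertex auth config from the
// environment, mirroring the existing google-vertex (Gemini) case.
interface VertexAuthConfig {
  credentialsPath?: string; // optional: ADC can also use gcloud defaults
  projectId: string;
  location: string;
}

function resolveVertexAuth(
  env: Record<string, string | undefined>
): VertexAuthConfig | null {
  // ANTHROPIC_VERTEX_PROJECT_ID takes precedence over GOOGLE_CLOUD_PROJECT,
  // per the proposal above.
  const projectId = env.ANTHROPIC_VERTEX_PROJECT_ID ?? env.GOOGLE_CLOUD_PROJECT;
  const location = env.GOOGLE_CLOUD_LOCATION;
  if (!projectId || !location) return null;
  return {
    credentialsPath: env.GOOGLE_APPLICATION_CREDENTIALS,
    projectId,
    location,
  };
}
```

A missing project or location would surface as "no credentials found" for the anthropic-vertex provider, matching how other providers fail when their env vars are absent.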

Usage

Once implemented, users would configure agents with:

  {
    "model": {
      "primary": "anthropic-vertex/claude-sonnet-4-5"
    }
  }

Or in the openclaw.json provider config:

  {
    "models": {
      "providers": {
        "anthropic-vertex": {
          "api": "anthropic-messages",
          "models": []
        }
      }
    }
  }

Prior Art

  • amazon-bedrock provider — Claude models accessed through AWS Bedrock with aws-sdk auth (closest precedent in this codebase)
  • google-vertex provider — Gemini models accessed through Vertex AI with GCP ADC (same auth mechanism, different model family)
  • Anthropic Vertex SDK — @anthropic-ai/vertex-sdk (official Node.js SDK)

Alternatives considered

LiteLLM Proxy

OpenClaw already has built-in support for LiteLLM as a provider (litellm). LiteLLM is an open-source gateway that can route requests to 100+ backends, including Vertex AI. Users could:

  1. Deploy LiteLLM Proxy with a Vertex AI backend (litellm config.yaml):

  model_list:
    - model_name: claude-sonnet-4-5
      litellm_params:
        model: vertex_ai/claude-sonnet-4-5@20250929
        vertex_project: my-gcp-project
        vertex_location: us-central1

  2. Point OpenClaw to the proxy:

  export LITELLM_API_KEY="sk-litellm-key"
  openclaw onboard --auth-choice litellm-api-key

Tradeoffs:
LiteLLM works today with no code changes and adds useful features like cost tracking, logging, and automatic fallbacks. However, it requires deploying and maintaining an additional service, adds latency from the extra network hop, and relies on LiteLLM's own Vertex integration rather than the official @anthropic-ai/vertex-sdk. A native anthropic-vertex provider is preferred for production deployments where minimizing infrastructure dependencies and latency matters.

Impact

This is an opt-in new feature; current users will not be affected.

Evidence/examples

No response

