
fix(bedrock): encode model arns for OpenAI compatible bedrock imported models#21701

Merged
Sameerlite merged 1 commit into BerriAI:main from
ta-stripe:fix/bedrock-openai-imported-model-name-encoding
Feb 23, 2026

Conversation

@ta-stripe
Contributor

@ta-stripe ta-stripe commented Feb 20, 2026

Relevant issues

Fixes an error introduced in #17097, the PR that added OpenAI-compatible Bedrock imported models.

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory. Adding at least 1 test is a hard requirement (see details)
  • My PR passes all unit tests with make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem
  • I have requested a Greptile review by commenting @greptileai and received a Confidence Score of at least 4/5 before requesting a maintainer review

CI (LiteLLM team)

CI status guideline:

  • 50-55 passing tests: main is stable with minor issues.
  • 45-49 passing tests: acceptable but needs attention.
  • <= 40 passing tests: unstable; be careful with your merges and assess the risk.
  • Branch creation CI run
    Link:

  • CI run for the last commit
    Link:

  • Merge / cherry-pick CI run
    Links:

Type

🐛 Bug Fix

Changes

This PR fixes a bug in how Bedrock model ARNs are sent when using the bedrock/openai/ provider route. Model IDs that are ARNs (e.g. arn:aws:bedrock:...:imported-model/xyz123xyz123) must be URL-encoded when used in the request path; otherwise the AWS API misparses the path and returns an UnknownOperationException error.

# AWS accepts
POST /bedrock/model/arn:aws:bedrock:us-west-2:123456789000:imported-model%2Fxyz123xyz123/invoke
# AWS returns com.amazon.coral.service#UnknownOperationException
POST /bedrock/model/arn:aws:bedrock:us-west-2:123456789000:imported-model/xyz123xyz123/invoke

This only affects Bedrock models that use the OpenAI-compatible Runtime API (e.g. imported Qwen2.5, Qwen2-VL, Qwen2.5-VL, and GPT-OSS). For models invoked through the Converse API, model ARNs are already encoded automatically in converse_handler.py; the Runtime invoke path did not encode them, so ARNs containing / (e.g. :imported-model/...) were sent unencoded and triggered the error.
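The encoding rule can be sketched as follows. This is a hypothetical re-implementation for illustration only, not the actual CommonUtils.encode_bedrock_runtime_modelid_arn source: it percent-encodes the slash that separates the resource type from the resource id, so AWS keeps the whole ARN as a single path segment.

```python
import re
from urllib.parse import quote

def encode_model_arn(model_id: str) -> str:
    """Percent-encode the resource-type slash in a Bedrock model ARN.

    Hypothetical sketch; LiteLLM's real utility is
    CommonUtils.encode_bedrock_runtime_modelid_arn.
    """
    match = re.match(r"^(arn:aws:bedrock:[^:]*:[^:]*:[^/]+)/(.+)$", model_id)
    if not match:
        # Not an ARN with a resource-type slash; leave untouched
        return model_id
    prefix, resource_id = match.groups()
    return f"{prefix}{quote('/', safe='')}{resource_id}"

arn = "arn:aws:bedrock:us-west-2:123456789000:imported-model/xyz123xyz123"
print(encode_model_arn(arn))
# arn:aws:bedrock:us-west-2:123456789000:imported-model%2Fxyz123xyz123
```

Plain model IDs without a slash pass through unchanged, which keeps the call safe on non-ARN models.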

@vercel

vercel Bot commented Feb 20, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project | Deployment | Actions | Updated (UTC)
litellm | Ready | Preview, Comment | Feb 20, 2026 9:08pm


@greptile-apps
Contributor

greptile-apps Bot commented Feb 20, 2026

Greptile Summary

This PR fixes a bug where Bedrock OpenAI-compatible imported models using ARNs (e.g., arn:aws:bedrock:us-east-1:...:imported-model/m4gc1mrfuddy) had their ARN slashes interpreted as URL path separators, causing 500 errors from the Bedrock runtime endpoint. The fix reuses the existing CommonUtils.encode_bedrock_runtime_modelid_arn utility (already used in the passthrough path) to encode the slash after the resource type (e.g., imported-model/ becomes imported-model%2F).

  • Adds a call to CommonUtils.encode_bedrock_runtime_modelid_arn(model_id) in AmazonBedrockOpenAIConfig.get_complete_url() before constructing the invoke URL
  • Updates the existing test assertion to expect the encoded %2F in the URL
  • No new test added in tests/litellm/ directory (required by contribution guidelines)
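The wiring described in the bullets above is roughly the following sketch. The function shapes and names here are simplified assumptions, not the exact upstream code in amazon_openai_transformation.py:

```python
from urllib.parse import quote

def encode_bedrock_runtime_modelid_arn(model_id: str) -> str:
    # Stand-in for CommonUtils.encode_bedrock_runtime_modelid_arn:
    # encode only the first slash (the resource-type separator)
    if not model_id.startswith("arn:"):
        return model_id
    return model_id.replace("/", quote("/", safe=""), 1)

def get_complete_url(api_base: str, model_id: str) -> str:
    # Encode the ARN before constructing the invoke URL, as the PR does
    model_id = encode_bedrock_runtime_modelid_arn(model_id)
    return f"{api_base}/model/{model_id}/invoke"

print(get_complete_url(
    "https://bedrock-runtime.us-west-2.amazonaws.com",
    "arn:aws:bedrock:us-west-2:123456789000:imported-model/xyz123xyz123",
))
```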

Confidence Score: 4/5

  • This PR is safe to merge — it applies a proven encoding utility to a missing code path, fixing a real bug with imported model ARNs.
  • The fix is minimal (3 lines of new code) and reuses an existing, well-tested utility (CommonUtils.encode_bedrock_runtime_modelid_arn) that is already used in the bedrock passthrough path. The regex patterns in the utility correctly handle the :imported-model/ pattern. The existing test is updated to validate the fix. Score is 4 instead of 5 because the PR lacks a new unit test in the tests/litellm/ directory as required by contribution guidelines.
  • No files require special attention — the change is straightforward and low-risk.

Important Files Changed

Filename Overview
litellm/llms/bedrock/chat/invoke_transformations/amazon_openai_transformation.py Adds ARN encoding for model IDs using CommonUtils.encode_bedrock_runtime_modelid_arn before constructing the invoke URL. The encoding correctly converts slashes in ARN resource types (e.g., :imported-model/ to :imported-model%2F) to prevent them from being treated as path separators. Import placed at module level per project conventions.
tests/llm_translation/test_bedrock_completion.py Updates existing test assertion to expect URL-encoded slash (%2F) in the imported-model ARN, matching the new encoding behavior. Test validates the fix is working correctly.

Sequence Diagram

sequenceDiagram
    participant Client
    participant LiteLLM as LiteLLM (AmazonBedrockOpenAIConfig)
    participant Encoder as CommonUtils.encode_bedrock_runtime_modelid_arn
    participant Bedrock as AWS Bedrock Runtime

    Client->>LiteLLM: completion(model="bedrock/openai/arn:...:imported-model/abc123")
    LiteLLM->>LiteLLM: _get_openai_model_id() strips prefixes
    Note right of LiteLLM: model_id = "arn:...:imported-model/abc123"
    LiteLLM->>Encoder: encode_bedrock_runtime_modelid_arn(model_id)
    Encoder-->>LiteLLM: "arn:...:imported-model%2Fabc123"
    LiteLLM->>LiteLLM: Build URL: /model/{encoded_model_id}/invoke
    LiteLLM->>Bedrock: POST /model/arn:...:imported-model%2Fabc123/invoke
    Bedrock-->>Client: Response

Last reviewed commit: 4885b36

Contributor

@greptile-apps greptile-apps Bot left a comment


36 files reviewed, 1 comment

Edit Code Review Agent Settings | Greptile

Comment thread debug_responses_import.py Outdated
@ta-stripe ta-stripe force-pushed the fix/bedrock-openai-imported-model-name-encoding branch from 96a3554 to 4885b36 on February 20, 2026 at 21:07
@ta-stripe
Contributor Author

@greptileai

Contributor

@greptile-apps greptile-apps Bot left a comment


2 files reviewed, no comments

Edit Code Review Agent Settings | Greptile

@Sameerlite Sameerlite merged commit 55ee8cd into BerriAI:main Feb 23, 2026
30 checks passed
