fix(bedrock): omit topP for Anthropic Converse; use AWSBedrockConfig in LlmFactory#4469

Merged
kartik-mem0 merged 3 commits into mem0ai:main from Himanshu-Sangshetti:fix/bedrock-converse-inference-config on Mar 25, 2026
Conversation

@Himanshu-Sangshetti
Contributor

Description

AWS Bedrock Converse was sending temperature and topP together in inferenceConfig on every call. Newer Anthropic Claude models on Bedrock reject that combination with:

ValidationException: temperature and top_p cannot both be specified

This PR:

  • Sets top_p on AWSBedrockConfig to None by default and only includes top_p in get_model_config() when the user sets it, so topP is not implied by default.
  • Introduces _build_inference_config() so Converse inferenceConfig is built in one place: for Anthropic, only temperature and maxTokens; topP is omitted. For other providers (e.g. Nova), topP is added only when top_p is present in the model config (including via model_kwargs).
  • Updates _prepare_input() so invoke_model request bodies only add top_p / topP when configured.
  • Maps aws_bedrock in LlmFactory to AWSBedrockConfig instead of BaseLlmConfig, so Bedrock-specific options (e.g. aws_region) are applied when creating the LLM from config.
  • Adds tests/llms/test_aws_bedrock.py covering factory wiring, inference profiles, and Converse inferenceConfig shape.

Dependencies: No new dependencies. Bedrock usage continues to require boto3 (existing requirement for this provider).

Issues this PR addresses

| Issue | What was wrong | What we changed |
| --- | --- | --- |
| #3891 | converse always sent both temperature and topP → ValidationException on Claude Sonnet 4.5+ | _build_inference_config() omits topP for Anthropic; top_p defaults to unset in AWSBedrockConfig so it isn't always serialized |
| #4004 | Same Converse error for Anthropic / Claude 4.x / Haiku 4.5-style models; top_p always implied by default | Same as above; inferenceConfig for Anthropic omits topP |
| #4136 | LlmFactory used BaseLlmConfig for aws_bedrock → Bedrock-only keys (e.g. aws_region) not applied | provider_to_class now uses AWSBedrockConfig |
| #4138 | Claude 4.x / Bedrock support (inference profiles, Converse) | Same fixes: correct inferenceConfig + factory config class; extract_provider already handles us. / eu. / … profile IDs (covered in tests) |
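The profile-ID handling mentioned for #4138 can be illustrated with a standalone sketch. The name `extract_provider` is from the PR description, but this body and the set of region prefixes are assumptions; mem0's real implementation may differ.

```python
# Cross-region inference profiles prepend a region code to the model ID,
# e.g. "us.anthropic.claude-sonnet-4-5-v1:0". Prefix list is an assumption.
REGION_PREFIXES = ("us.", "eu.", "apac.")

def extract_provider(model_id: str) -> str:
    """Return the provider segment of a Bedrock model or profile ID (sketch)."""
    for prefix in REGION_PREFIXES:
        if model_id.startswith(prefix):
            model_id = model_id[len(prefix):]
            break
    # Bedrock model IDs look like "<provider>.<model-name>-<version>"
    return model_id.split(".")[0]
```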

Type of change

  • Bug fix (non-breaking change which fixes an issue)

How Has This Been Tested?

  • Ran pytest tests/llms/test_aws_bedrock.py -v locally; all tests passed.
  • Tests use a mock Bedrock client and check that:
    • converse is called with an inferenceConfig that does not include topP for Anthropic models (including us.anthropic… inference profile IDs), for both normal and tool flows.
    • LlmFactory builds AWSBedrockConfig and accepts aws_region in the config dict.
    • AWSBedrockConfig only includes top_p in the model config when it is set.
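The mocking approach in those tests can be sketched with a stand-in caller rather than mem0's real AWSBedrockLLM class (whose internals are not shown here); `call_converse` is a hypothetical helper for illustration only.

```python
from unittest.mock import MagicMock

def call_converse(client, model_id, messages, inference_config):
    # Stand-in for the LLM's Converse call path.
    return client.converse(modelId=model_id, messages=messages,
                           inferenceConfig=inference_config)

mock_client = MagicMock()
call_converse(mock_client, "us.anthropic.claude-sonnet-4-5-v1:0",
              [{"role": "user", "content": [{"text": "hi"}]}],
              {"temperature": 0.1, "maxTokens": 2000})

# Inspect the keyword arguments the mock received and assert the shape.
_, kwargs = mock_client.converse.call_args
assert "topP" not in kwargs["inferenceConfig"]
```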

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • Any dependent changes have been merged and published in downstream modules
  • I have checked my code and corrected any misspellings

Maintainer Checklist

"temperature": self.model_config.get("temperature", 0.1),
"top_p": self.model_config.get("top_p", 0.9),
}
if self.model_config.get("top_p") is not None:
Member


Instead of defining this conditional multiple times, we can create a sanitization function that does it for us in a cleaner manner.

Contributor Author


Thanks, and done. I pulled the repeated "only add top_p if it's set" logic into one small helper (_merge_optional_top_p) and use it everywhere in _prepare_input instead of repeating the same if check.

