fix(bedrock): omit topP for Anthropic Converse; use AWSBedrockConfig in LlmFactory #4469
Merged
kartik-mem0 merged 3 commits into mem0ai:main on Mar 25, 2026
Conversation
whysosaket reviewed Mar 21, 2026
mem0/llms/aws_bedrock.py
Outdated
| "temperature": self.model_config.get("temperature", 0.1), | ||
| "top_p": self.model_config.get("top_p", 0.9), | ||
| } | ||
| if self.model_config.get("top_p") is not None: |
Member
Instead of defining this conditional multiple times, we can create a sanitization function that does it for us in a cleaner manner.
Contributor
Author
Thanks, and done. I pulled the repeated "only add top_p if it's set" logic into one small helper (`_merge_optional_top_p`) and use it everywhere in `_prepare_input` instead of copying the same `if` over and over.
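For reference, a minimal sketch of the idea (the helper name comes from the thread above, but the standalone signature and body here are assumptions, not the merged code; in the PR it is presumably a method working off `self.model_config`):

```python
def _merge_optional_top_p(params: dict, model_config: dict, key: str = "top_p") -> dict:
    """Copy top_p into params only when the user explicitly configured it."""
    top_p = model_config.get("top_p")
    if top_p is not None:
        params[key] = top_p
    return params

_merge_optional_top_p({"temperature": 0.1}, {"top_p": 0.9})  # adds "top_p": 0.9
_merge_optional_top_p({"temperature": 0.1}, {})              # leaves params unchanged
```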
kartik-mem0 approved these changes Mar 25, 2026
Description
AWS Bedrock Converse was sending `temperature` and `topP` together in `inferenceConfig` on every call. Newer Anthropic Claude models on Bedrock reject that combination with `ValidationException: temperature and top_p cannot both be specified`.

This PR:

- Defaults `top_p` on `AWSBedrockConfig` to `None` and only includes `top_p` in `get_model_config()` when the user sets it, so `topP` is not implied by default.
- Adds `_build_inference_config()` so the Converse `inferenceConfig` is built in one place: for Anthropic, only `temperature` and `maxTokens`; `topP` is omitted. For other providers (e.g. Nova), `topP` is added only when `top_p` is present in the model config (including via `model_kwargs`). A sketch of this logic follows below.
- Updates `_prepare_input()` so `invoke_model` request bodies only add `top_p`/`topP` when configured.
- Maps `aws_bedrock` in `LlmFactory` to `AWSBedrockConfig` instead of `BaseLlmConfig`, so Bedrock-specific options (e.g. `aws_region`) are applied when creating the LLM from config.
- Adds `tests/llms/test_aws_bedrock.py` covering factory wiring, inference profiles, and the Converse `inferenceConfig` shape.

Dependencies: No new dependencies. Bedrock usage continues to require `boto3` (an existing requirement for this provider).
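To make the new Converse behavior concrete, here is a rough sketch of the consolidated logic (the function shape, the `max_tokens` default, and the `provider` string are assumptions; the merged `_build_inference_config()` may differ):

```python
def _build_inference_config(provider: str, model_config: dict) -> dict:
    """Build the Converse inferenceConfig in one place (illustrative).

    Anthropic Claude models on Bedrock reject temperature and topP together,
    so topP is never sent for Anthropic; other providers (e.g. Nova) get
    topP only when the user actually configured top_p.
    """
    inference_config = {
        "temperature": model_config.get("temperature", 0.1),
        "maxTokens": model_config.get("max_tokens", 2000),  # assumed default
    }
    if provider != "anthropic" and model_config.get("top_p") is not None:
        inference_config["topP"] = model_config["top_p"]
    return inference_config
```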
Issues this PR addresses

- `converse` always sent both `temperature` and `topP` → `ValidationException` on Claude Sonnet 4.5+. Fix: `_build_inference_config()` omits `topP` for Anthropic, and `top_p` defaults to unset in `AWSBedrockConfig` so it isn't always serialized.
- `LlmFactory` used `BaseLlmConfig` for `aws_bedrock` → Bedrock-only keys (e.g. `aws_region`) were not applied. Fix: `provider_to_class` now uses `AWSBedrockConfig`.
- Tests now cover the `inferenceConfig` shape and the factory config class; `extract_provider` already handles `us.`/`eu.`/… profile IDs (covered in tests).
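`extract_provider` is named above but its body is not shown in this PR; purely as an illustration of the idea (the region prefixes and splitting logic are guesses, not mem0's code):

```python
GEO_PREFIXES = {"us", "eu", "apac"}  # assumed set of inference-profile prefixes

def extract_provider(model_id: str) -> str:
    """Illustrative: 'us.anthropic.claude-sonnet-4-5-v1:0' -> 'anthropic'."""
    parts = model_id.split(".")
    # Inference-profile IDs prepend a geo code to the plain model ID.
    if parts[0] in GEO_PREFIXES:
        parts = parts[1:]
    return parts[0]
```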
Type of change

How Has This Been Tested?
- Ran `pytest tests/llms/test_aws_bedrock.py -v` locally; all tests passed.
- Verified `converse` is called with an `inferenceConfig` that does not include `topP` for Anthropic models (including `us.anthropic…` inference profile IDs), for both normal and tool-use flows.
- Verified `LlmFactory` builds `AWSBedrockConfig` and accepts `aws_region` in the config dict.
- Verified `AWSBedrockConfig` only includes `top_p` in the model config when it is set.
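Illustrative pytest cases in the spirit of the assertions above, written against the `_build_inference_config` sketch from the description rather than the PR's actual test file:

```python
def test_inference_config_omits_top_p_for_anthropic():
    cfg = _build_inference_config("anthropic", {"temperature": 0.2, "top_p": 0.9})
    assert "topP" not in cfg
    assert cfg["temperature"] == 0.2

def test_inference_config_includes_top_p_only_when_set_for_other_providers():
    assert _build_inference_config("amazon", {"top_p": 0.9})["topP"] == 0.9
    assert "topP" not in _build_inference_config("amazon", {})
```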
Checklist:

Maintainer Checklist
`temperature` and `top_p` cannot both be specified #3891