feat(bedrock): add MiniMax provider support for AWS Bedrock#4609
whysosaket merged 3 commits into mem0ai:main from
Conversation
- Add `'minimax'` to the `PROVIDERS` list in `aws_bedrock.py`
- Add a MiniMax branch in `_generate_standard()` using the Bedrock Converse API
- Handle the MiniMax M2.5 reasoning-model response: the `content` array may contain a `reasoningContent` block before the actual text block, so iterate to find the first block with a `'text'` key
- Add unit tests: extract_provider, standard response, reasoning-model response (skips `reasoningContent`), and `inferenceConfig` validation
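The content-block scan described above can be sketched as follows. This is a minimal illustration, not mem0's actual code: `extract_text` is a hypothetical name, while the response shape follows the Bedrock Converse API format.

```python
# Hypothetical sketch of the content-block scan described in the commit.
# Response shape follows the Bedrock Converse API; names are illustrative.
def extract_text(response: dict) -> str:
    """Return the first text block, skipping any reasoningContent blocks."""
    blocks = response["output"]["message"]["content"]
    for block in blocks:
        if "text" in block:  # reasoningContent blocks carry no 'text' key
            return block["text"]
    return ""

# Example: a MiniMax M2.5 reasoning response with a leading reasoning block
resp = {
    "output": {
        "message": {
            "content": [
                {"reasoningContent": {"reasoningText": {"text": "thinking..."}}},
                {"text": "final answer"},
            ]
        }
    }
}
print(extract_text(resp))  # → final answer
```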
…e API

The Bedrock Converse API requires system messages to be passed via the top-level `system` parameter, not as a `role='system'` entry in `messages`. The previous implementation discarded system messages entirely (taking only the last user message), causing mem0's fact-extraction prompt to be ignored and MiniMax to return unstructured text instead of JSON.

Fix: iterate over all messages, split system from user/assistant messages, and pass system content via `converse_params['system']`. Add a test for system prompt routing.
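The message-splitting fix in this commit can be sketched like this. It is a hedged sketch assuming plain message dicts as input; `build_converse_params` is a hypothetical helper name, not mem0's actual function, but the parameter shapes follow the Bedrock Converse API.

```python
# Illustrative sketch of the system-message routing fix; parameter shapes
# follow the Bedrock Converse API, helper and variable names are assumptions.
def build_converse_params(model_id: str, messages: list) -> dict:
    system_blocks, chat_messages = [], []
    for msg in messages:
        if msg["role"] == "system":
            # Converse requires system prompts via the top-level 'system' param
            system_blocks.append({"text": msg["content"]})
        else:
            chat_messages.append(
                {"role": msg["role"], "content": [{"text": msg["content"]}]}
            )
    params = {"modelId": model_id, "messages": chat_messages}
    if system_blocks:
        params["system"] = system_blocks
    return params

params = build_converse_params(
    "minimax.minimax-m2.5",
    [
        {"role": "system", "content": "Extract facts as JSON."},
        {"role": "user", "content": "I love coffee."},
    ],
)
print(params["system"])  # → [{'text': 'Extract facts as JSON.'}]
```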
kartik-mem0
left a comment
Thanks for the PR! The implementation looks solid and the tests are thorough. A few questions before we merge:
- **Reusing `_build_inference_config()`** — I noticed the minimax branch builds its own `inferenceConfig` dict inline. We already have `_build_inference_config()`, which does essentially the same thing — any reason not to reuse it here? That would keep things consistent and avoid the two drifting apart over time.
- **`topP` being excluded** — The tests explicitly assert that `topP` isn't in the inference config, but there's no comment explaining why. Does MiniMax reject it like Anthropic does? If so, a quick one-liner comment would save the next person from wondering whether this was intentional or just missed.
- **Tool support** — MiniMax isn't added to the `supports_tools` path. Is that because the model doesn't support tool use on Bedrock yet, or was it just out of scope for this PR? Either way is fine, just curious — and if it's a known limitation, a brief comment in the code would be helpful.
…P/tool comments

- Extend `_build_inference_config()` to skip `topP` for `'minimax'` (same rule as `'anthropic'`: MiniMax M2.x reasoning models reject `temperature` + `topP` simultaneously on the Converse API)
- Replace the inline `inferenceConfig` dict in the minimax branch with a call to `_build_inference_config()` so both providers stay in sync
- Add an inline comment on `supports_tools` explaining why MiniMax is excluded: tool use is only available via the bedrock-mantle endpoint, not the bedrock-runtime Converse API
- Add two `TestBuildInferenceConfig` unit tests for MiniMax `topP` behaviour
norrishuang
left a comment
Thanks for the thorough review, @kartik-mem0!
1. Reusing _build_inference_config()
Good catch. The reason I originally built the inferenceConfig inline for MiniMax was that the existing _build_inference_config() only skipped topP for anthropic — if a user had top_p configured, any other provider would receive topP, including MiniMax. Since MiniMax M2.x (like Anthropic) rejects requests that include both temperature and topP simultaneously on the Bedrock Converse API, that would cause a runtime error.
I've now extended _build_inference_config() to also skip topP for minimax, and updated the MiniMax branch to call it — so both providers share the same logic and can't drift apart.
2. Why topP is excluded — missing comment
You're right, I missed the explanation. MiniMax M2.x are reasoning models and behave like Anthropic in this regard: the Bedrock Converse API returns a ValidationException if both temperature and topP are present in inferenceConfig. I've added an inline comment to _build_inference_config() and updated the docstring to make this explicit. Also added two new unit tests to TestBuildInferenceConfig covering the MiniMax topP omission behaviour.
3. Tool support
MiniMax M2.5 on Bedrock does support tool calling, but only via the bedrock-mantle endpoint (OpenAI-compatible Chat Completions API). This class uses bedrock-runtime (Converse API), which does not support tool use for MiniMax per the AWS model card. So leaving MiniMax out of supports_tools is correct for this code path. I've added a comment on the supports_tools line explaining the limitation.
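Illustrative only: mem0's actual code path differs, but the exclusion plus the comment described above might read like this (the set contents, apart from `minimax` being absent, are assumptions):

```python
# Providers with tool-use support on the bedrock-runtime Converse API.
# NOTE: 'minimax' is intentionally absent: MiniMax tool calling is only
# available via the bedrock-mantle (OpenAI-compatible) endpoint, not the
# bedrock-runtime Converse API.
TOOL_SUPPORTING_PROVIDERS = {"anthropic"}  # hypothetical set for illustration

def supports_tools(provider: str) -> bool:
    return provider in TOOL_SUPPORTING_PROVIDERS
```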
All changes are in the latest commit (f3be01c). 48 tests passing.
Summary
Add support for MiniMax models (e.g. `minimax.minimax-m2.5`) on AWS Bedrock.

Problem
The `PROVIDERS` allowlist in `aws_bedrock.py` did not include `minimax`, causing a `ValueError: Unknown provider in model` when users tried to configure a MiniMax model via the `aws_bedrock` LLM provider.

Changes
`mem0/llms/aws_bedrock.py`

- Added `"minimax"` to the `PROVIDERS` list
- Added a `minimax` branch in `_generate_standard()` that calls the Bedrock Converse API
- Reasoning-model handling: the `content` array may contain a `reasoningContent` block before the actual `text` block. The implementation iterates over content blocks and returns the first one that contains a `"text"` key, safely skipping any `reasoningContent` blocks.

`tests/llms/test_aws_bedrock.py`

Added a `TestMiniMaxProvider` class with 4 unit tests:

- `test_extract_provider` — verifies `minimax.*` model IDs are parsed correctly
- `test_generate_response_text_only` — standard single-text-block response
- `test_generate_response_reasoning_model` — response with `reasoningContent` + `text` blocks; asserts the reasoning block is skipped
- `test_inference_config` — verifies `maxTokens` and `temperature` are passed; `topP` is not included

Testing
All 4 new tests pass. Existing test suite unaffected.
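For reference, the reasoning-model test could be sketched with a stubbed client like this. It is a hedged sketch using `unittest.mock`; the helper names and asserted payload are assumptions, not mem0's actual test code.

```python
# Sketch of the reasoning-model test described above, stubbing the boto3
# Converse call; names and payloads are illustrative assumptions.
from unittest.mock import MagicMock

def make_stub_client() -> MagicMock:
    """Stub bedrock-runtime client whose converse() returns a
    reasoning-model-shaped response."""
    client = MagicMock()
    client.converse.return_value = {
        "output": {
            "message": {
                "content": [
                    {"reasoningContent": {"reasoningText": {"text": "..."}}},
                    {"text": '{"facts": []}'},
                ]
            }
        }
    }
    return client

def test_generate_response_reasoning_model():
    client = make_stub_client()
    response = client.converse(modelId="minimax.minimax-m2.5", messages=[])
    blocks = response["output"]["message"]["content"]
    # mirror the implementation: the first block carrying a 'text' key wins
    text = next(b["text"] for b in blocks if "text" in b)
    assert text == '{"facts": []}'

test_generate_response_reasoning_model()
```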