feat(bedrock): add MiniMax provider support for AWS Bedrock#4609

Merged
whysosaket merged 3 commits into mem0ai:main from norrishuang:feature/bedrock-minimax-support
Mar 30, 2026

Conversation

@norrishuang
Contributor

Summary

Add support for MiniMax models (e.g. minimax.minimax-m2.5) on AWS Bedrock.

Problem

The PROVIDERS allowlist in aws_bedrock.py did not include minimax, so configuring a MiniMax model via the aws_bedrock LLM provider raised ValueError: Unknown provider in model.

Changes

mem0/llms/aws_bedrock.py

  • Added "minimax" to the PROVIDERS list
  • Added a minimax branch in _generate_standard() that calls the Bedrock Converse API
  • Special handling for reasoning models: MiniMax M2.5 is a reasoning model. Its Converse API response content array may contain a reasoningContent block before the actual text block. The implementation iterates over content blocks and returns the first one that contains a "text" key, safely skipping any reasoningContent blocks.
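The block-skipping logic described above can be sketched as follows. This is a hypothetical illustration, not the actual mem0 code: the response shape follows the Bedrock Converse API, but the function name and variables are assumptions.

```python
def extract_text(converse_response: dict) -> str:
    """Return the first text block, skipping any reasoningContent blocks."""
    content = converse_response["output"]["message"]["content"]
    for block in content:
        # reasoningContent blocks have no top-level "text" key, so they
        # are skipped; the first block with "text" is the final answer.
        if "text" in block:
            return block["text"]
    raise ValueError("No text block found in Converse response")


# A reasoning-model response may place a reasoningContent block first:
response = {
    "output": {
        "message": {
            "content": [
                {"reasoningContent": {"reasoningText": {"text": "thinking..."}}},
                {"text": "final answer"},
            ]
        }
    }
}
print(extract_text(response))  # prints: final answer
```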

tests/llms/test_aws_bedrock.py

Added TestMiniMaxProvider class with 4 unit tests:

  • test_extract_provider — verifies minimax.* model IDs are parsed correctly
  • test_generate_response_text_only — standard single-text-block response
  • test_generate_response_reasoning_model — response with reasoningContent + text blocks; asserts the reasoning block is skipped
  • test_inference_config — verifies maxTokens and temperature are passed and topP is excluded

Testing

All 4 new tests pass. Existing test suite unaffected.

tests/llms/test_aws_bedrock.py::TestMiniMaxProvider::test_extract_provider PASSED
tests/llms/test_aws_bedrock.py::TestMiniMaxProvider::test_generate_response_text_only PASSED
tests/llms/test_aws_bedrock.py::TestMiniMaxProvider::test_generate_response_reasoning_model PASSED
tests/llms/test_aws_bedrock.py::TestMiniMaxProvider::test_inference_config PASSED

Configuration

config = {
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            "model": "minimax.minimax-m2.5",
            "temperature": 0.1,
            "max_tokens": 2000,
        }
    }
}

- Add 'minimax' to PROVIDERS list in aws_bedrock.py
- Add MiniMax branch in _generate_standard() using Bedrock Converse API
- Handle MiniMax M2.5 reasoning model response: content array may contain
  a reasoningContent block before the actual text block, so iterate to
  find the first block with a 'text' key
- Add unit tests: extract_provider, standard response, reasoning model
  response (skips reasoningContent), and inferenceConfig validation
…e API

Bedrock Converse API requires system messages to be passed via the
top-level 'system' parameter, not as a role='system' entry in messages.
The previous implementation discarded system messages entirely (only
taking the last user message), causing mem0's fact-extraction prompt
to be ignored and MiniMax to return unstructured text instead of JSON.

Fix: iterate all messages, split system vs user/assistant, pass system
content via converse_params['system']. Add test for system prompt routing.
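The routing fix described in this commit message can be sketched as below. The system/messages split follows the documented Converse API request shape, but the function name and surrounding structure are assumptions for illustration.

```python
def build_converse_params(model_id: str, messages: list[dict]) -> dict:
    """Split system messages from chat turns for the Converse API."""
    system_blocks = []
    chat_messages = []
    for msg in messages:
        if msg["role"] == "system":
            # Converse requires system prompts in the top-level 'system'
            # parameter, not as role="system" entries in 'messages'.
            system_blocks.append({"text": msg["content"]})
        else:
            chat_messages.append(
                {"role": msg["role"], "content": [{"text": msg["content"]}]}
            )
    params = {"modelId": model_id, "messages": chat_messages}
    if system_blocks:
        params["system"] = system_blocks
    return params


params = build_converse_params(
    "minimax.minimax-m2.5",
    [
        {"role": "system", "content": "Extract facts as JSON."},
        {"role": "user", "content": "I live in Paris."},
    ],
)
# params["system"] carries the fact-extraction prompt instead of it
# being dropped, so the model returns structured JSON as expected.
```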
Contributor

@kartik-mem0 kartik-mem0 left a comment


Thanks for the PR! The implementation looks solid and the tests are thorough. A few questions before we merge:

  1. Reusing _build_inference_config() — I noticed the minimax branch builds its own inferenceConfig dict inline. We already have _build_inference_config() that does essentially the same thing — any reason not to reuse it here? Would keep things consistent and avoid the two drifting apart over time.

  2. topP being excluded — The tests explicitly assert that topP isn't in the inference config, but there's no comment explaining why. Does MiniMax reject it like Anthropic does? If so, a quick one-liner comment would save the next person from wondering whether this was intentional or just missed.

  3. Tool support — MiniMax isn't added to the supports_tools path. Is that because the model doesn't support tool use on Bedrock yet, or was it just out of scope for this PR? Either way is fine, just curious — and if it's a known limitation, a brief comment in the code would be helpful.

…P/tool comments

- Extend _build_inference_config() to skip topP for 'minimax' (same rule
  as 'anthropic': MiniMax M2.x reasoning models reject temperature+topP
  simultaneously on the Converse API)
- Replace inline inferenceConfig dict in the minimax branch with a call
  to _build_inference_config() so both providers stay in sync
- Add inline comment on supports_tools explaining why MiniMax is excluded:
  tool use is only available via the bedrock-mantle endpoint, not
  bedrock-runtime Converse API
- Add two TestBuildInferenceConfig unit tests for MiniMax topP behaviour
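The shared builder described in this commit message might look roughly like the sketch below; the provider rule matches the commit message, but the helper name and config shape are illustrative, not the actual mem0 implementation.

```python
# Providers whose Bedrock Converse endpoint rejects temperature and topP
# being set together; topP is dropped for these to avoid a
# ValidationException at request time.
SKIP_TOP_P = {"anthropic", "minimax"}


def build_inference_config(provider: str, config: dict) -> dict:
    """Build a Converse inferenceConfig, omitting topP where unsupported."""
    inference = {}
    if config.get("max_tokens") is not None:
        inference["maxTokens"] = config["max_tokens"]
    if config.get("temperature") is not None:
        inference["temperature"] = config["temperature"]
    if config.get("top_p") is not None and provider not in SKIP_TOP_P:
        inference["topP"] = config["top_p"]
    return inference


cfg = {"max_tokens": 2000, "temperature": 0.1, "top_p": 0.9}
print(build_inference_config("minimax", cfg))  # topP omitted
print(build_inference_config("meta", cfg))     # topP included
```

Centralizing the rule in one helper means a future provider with the same restriction only needs to be added to the set, keeping both branches in sync.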
Contributor Author

@norrishuang norrishuang left a comment


Thanks for the thorough review, @kartik-mem0!

1. Reusing _build_inference_config()

Good catch. The reason I originally built the inferenceConfig inline for MiniMax was that the existing _build_inference_config() only skipped topP for anthropic — if a user had top_p configured, any other provider would receive topP, including MiniMax. Since MiniMax M2.x (like Anthropic) rejects requests that include both temperature and topP simultaneously on the Bedrock Converse API, that would cause a runtime error.

I've now extended _build_inference_config() to also skip topP for minimax, and updated the MiniMax branch to call it — so both providers share the same logic and can't drift apart.

2. Why topP is excluded — missing comment

You're right, I missed the explanation. MiniMax M2.x are reasoning models and behave like Anthropic in this regard: the Bedrock Converse API returns a ValidationException if both temperature and topP are present in inferenceConfig. I've added an inline comment to _build_inference_config() and updated the docstring to make this explicit. Also added two new unit tests to TestBuildInferenceConfig covering the MiniMax topP omission behaviour.

3. Tool support

MiniMax M2.5 on Bedrock does support tool calling, but only via the bedrock-mantle endpoint (OpenAI-compatible Chat Completions API). This class uses bedrock-runtime (Converse API), which does not support tool use for MiniMax per the AWS model card. So leaving MiniMax out of supports_tools is correct for this code path. I've added a comment on the supports_tools line explaining the limitation.

All changes are in the latest commit (f3be01c). 48 tests passing.

@whysosaket whysosaket merged commit 5d30af9 into mem0ai:main Mar 30, 2026
6 of 7 checks passed
rainfd pushed a commit to rainfd/mem0 that referenced this pull request Apr 8, 2026
wuhonglei pushed a commit to wuhonglei/mem0 that referenced this pull request Apr 19, 2026