[Feature Request] Support MiniMax LLM Provider #4132

@Ivy-End

Feature Request

Description

Please add support for MiniMax as an LLM provider in Mem0.

Motivation

MiniMax is a Chinese AI company offering high-quality language models with competitive pricing. Many users in China would benefit from using MiniMax with Mem0 for enhanced memory capabilities.

Suggested Implementation

MiniMax provides an Anthropic-compatible API, so it could potentially be added through the LiteLLM provider or as a direct integration.
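As an interim sketch of the LiteLLM route: the config below assumes Mem0's existing "litellm" provider can forward a MiniMax model string. The "minimax/" routing prefix and the exact config keys are assumptions, not confirmed behavior — check the LiteLLM and Mem0 documentation before relying on this.

```python
import os

# Hypothetical workaround: route MiniMax through Mem0's existing
# "litellm" provider instead of a (not yet existing) native one.
# The "minimax/" model prefix is an assumed LiteLLM routing convention.
config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "minimax/abab6.5s-chat",  # assumed routing prefix
            "api_key": os.environ.get("MINIMAX_API_KEY", "your-minimax-api-key"),
        },
    }
}

# With mem0 installed, this dict would be passed to Memory.from_config(config).
```

A native `"minimax"` provider (as requested below) would make the extra routing prefix unnecessary.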

Additional Context

Example Use Case

from mem0 import Memory

config = {
    "llm": {
        "provider": "minimax",
        "config": {
            "model": "abab6.5s-chat",
            "api_key": "your-minimax-api-key"
        }
    }
}
m = Memory.from_config(config)

Thank you for considering this feature request!
