Feature Request
Description
Please add support for MiniMax as an LLM provider in Mem0.
Motivation
MiniMax is a Chinese AI company offering high-quality language models with competitive pricing. Many users in China would benefit from using MiniMax with Mem0 for enhanced memory capabilities.
Suggested Implementation
MiniMax provides an Anthropic-compatible API, so support could be added either through the existing LiteLLM provider or as a direct integration.
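Until a dedicated provider exists, a workaround along these lines might already be possible through Mem0's LiteLLM provider. Note this is a sketch under assumptions: the `minimax/abab6.5s-chat` model string follows LiteLLM's usual `<provider>/<model>` convention and the `MINIMAX_API_KEY` environment variable name is a guess — neither is verified against LiteLLM's actual MiniMax support.

```python
import os
from mem0 import Memory

# Hypothetical: route MiniMax through Mem0's existing LiteLLM provider.
# The model prefix and env var name below are assumptions, not verified.
os.environ["MINIMAX_API_KEY"] = "your-minimax-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "minimax/abab6.5s-chat",
        },
    }
}

m = Memory.from_config(config)
```

If this routing works, a direct `minimax` provider would still be preferable for first-class configuration and error handling.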
Additional Context
Example Use Case
```python
from mem0 import Memory

# Proposed configuration once a "minimax" provider is available
config = {
    "llm": {
        "provider": "minimax",
        "config": {
            "model": "abab6.5s-chat",
            "api_key": "your-minimax-api-key",
        },
    }
}

m = Memory.from_config(config)
```
Thank you for considering this feature request!