Conversation
Coverage reportClick to see where and how coverage changed
This report was generated by python-coverage-comment-action |
||||||||||||||||||||||||
a0b6c7d to
ce65293
Compare
d990079 to
efdab3b
Compare
all-hands-bot
left a comment
There was a problem hiding this comment.
Code Review Summary
This PR has critical issues that must be addressed before merging.
🔴 Critical Issues
-
Undisclosed Breaking Changes: The PR description claims to only add MiniMax-M2.5 support, but the changes actually remove dozens of existing models including:
- OpenAI models: o3, o4-mini, gpt-4o, gpt-4o-mini, gpt-4.1, codex-mini-latest
- Anthropic models: claude-3-5-sonnet-20240620, claude-3-5-sonnet-20241022, claude-3-7-sonnet, claude-opus-4-20250514, etc.
- Gemini models: gemini-2.5-pro
- Mistral models: devstral-small-2505, devstral-small-2507
This is a major breaking change that will prevent users from selecting these models in the GUI.
-
Inconsistent Naming Convention: The new model is named
MiniMax-M2.5(PascalCase with capital M), but all other models in the codebase use lowercase (e.g.,gpt-5.2,claude-opus-4-5-20251101,gemini-3-pro-preview). This inconsistency needs verification with the LiteLLM API.
🟠 Important Issues
- Frontend-Backend Mismatch: Some models in
VERIFIED_MODELS(e.g.,deepseek-chat) don't have corresponding entries in the backendopenhands_modelslist, which will cause issues.
Required Actions
Option A: Update this PR to ONLY add MiniMax-M2.5 without removing other models
Option B: Update the PR description to clearly explain why all these models are being removed and get explicit approval for this breaking change
Please also:
- Verify the correct casing for MiniMax-M2.5 with the LiteLLM provider
- Ensure frontend and backend model lists are consistent
- Add tests to verify the new model works
| 'openhands/gpt-5-2025-08-07', | ||
| 'openhands/gpt-5-mini-2025-08-07', | ||
| 'openhands/claude-opus-4-20250514', | ||
| 'openhands/claude-opus-4-5-20251101', |
There was a problem hiding this comment.
🟠 Important: Same casing issue here - MiniMax-M2.5 uses PascalCase with capital "M", which is inconsistent with other models in the list that use lowercase.
| 'openhands/claude-opus-4-5-20251101', | |
| 'openhands/minimax-m2.5', |
Verify the correct casing with the LiteLLM provider configuration.
There was a problem hiding this comment.
Verified: The LLM proxy at llm-proxy.app.all-hands.dev returns the model as MiniMax-M2.5 (with capital M's). This capitalization matches the actual API response.
Add MiniMax-M2.5 and update model names to match LLM proxy: - openhands/utils/llm.py: openhands_models list for backend API - frontend/src/utils/verified-models.ts: All verified model lists Featured models (matching LLM proxy at llm-proxy.app.all-hands.dev): 1. claude-opus-4-5-20251101 2. claude-sonnet-4-5-20250929 3. gpt-5.2-codex 4. gpt-5.2 5. MiniMax-M2.5 6. gemini-3-pro-preview 7. gemini-3-flash-preview 8. deepseek-chat 9. devstral-medium-2512 10. kimi-k2-0711-preview 11. qwen3-coder-480b Changes: - Added MiniMax-M2.5 (capitalization matches LLM proxy response) - Removed older model versions from VERIFIED_MODELS (user requested cleanup) - Kept older models in VERIFIED_OPENAI_MODELS/VERIFIED_ANTHROPIC_MODELS for LiteLLM provider detection compatibility Co-authored-by: openhands <openhands@all-hands.dev>
efdab3b to
28a9a41
Compare
Co-authored-by: openhands <openhands@all-hands.dev>
Summary
Add MiniMax-M2.5 model support to the OpenHands GUI.
Changes
openhands/utils/llm.py: Addedopenhands/minimax-m2.5to theopenhands_modelslist so it appears in the backend APIfrontend/src/utils/verified-models.ts: Addedminimax-m2.5to bothVERIFIED_MODELSandVERIFIED_OPENHANDS_MODELSlistsThis enables MiniMax-M2.5 to appear in the model selector dropdown in the GUI.
@neubig can click here to continue refining the PR
To run this PR locally, use the following command:
GUI with Docker: