Bug type
Behavior bug (incorrect output/state without crash)
Summary
MiniMax models (e.g., MiniMax-M2.5-highspeed) do not stream output in the TUI. The response appears all at once instead of token-by-token.
Steps to reproduce
- Use MiniMax model with OpenClaw
- Send a message in TUI
- Observe response appears all at once, not streaming
Expected behavior
Response should stream token-by-token (like DeepSeek does)
Actual behavior
The full response is displayed only after generation completes
OpenClaw version
2026.3.12
Operating system
Ubuntu 24.04
Install method
No response
Model
minimax/MiniMax-M2.5-highspeed
Provider / routing chain
openclaw->minimax
Config file / key location
No response
Additional provider/model setup details
No response
Logs, screenshots, and evidence
No response
Impact and severity
No response
Additional information
Notes
- DeepSeek models stream correctly in the same setup
- A direct curl test against the MiniMax API streams correctly, so the API itself supports streaming
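For reference, a sketch of the kind of curl test described in the notes. The endpoint URL, payload shape, and `MINIMAX_API_KEY` variable are assumptions (an OpenAI-compatible chat-completions API); substitute the values from your actual setup:

```shell
# Hypothetical streaming check against the MiniMax API.
# Endpoint and payload are assumptions -- adjust to your configuration.
PAYLOAD='{"model": "MiniMax-M2.5-highspeed", "stream": true, "messages": [{"role": "user", "content": "hello"}]}'

# -N disables curl's output buffering so SSE chunks print as they arrive.
# Guarded so the request is only sent when an API key is present.
if [ -n "${MINIMAX_API_KEY:-}" ]; then
  curl -N -sS "https://api.minimax.io/v1/chat/completions" \
    -H "Authorization: Bearer $MINIMAX_API_KEY" \
    -H "Content-Type: application/json" \
    -d "$PAYLOAD"
fi
```

If streaming works at the API level, the response arrives as incremental `data:` lines rather than one JSON body, which would point at the client (OpenClaw's MiniMax provider path) rather than the API.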