[Bug]: MiniMax streaming not working - output displays all at once instead of streaming #45882

@xjhseu

Description

Bug type

Behavior bug (incorrect output/state without crash)

Summary

The MiniMax model (e.g., MiniMax-M2.5-highspeed) does not stream its output in the TUI. The response appears all at once instead of token-by-token.

Steps to reproduce

  1. Use a MiniMax model with OpenClaw
  2. Send a message in the TUI
  3. Observe that the response appears all at once instead of streaming
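The all-at-once symptom in step 3 can be made objective by timing when chunks reach the TUI. A minimal sketch (the helper name and threshold are hypothetical, not part of OpenClaw):

```python
# Hypothetical helper: classify whether response chunks were truly streamed.
# arrival_times are monotonically increasing timestamps in seconds,
# recorded as each chunk reaches the client.
def classify_delivery(arrival_times, min_spread=0.5):
    """Return 'streamed' if chunks arrived spread out over time,
    'buffered' if everything landed essentially at once."""
    if len(arrival_times) < 2:
        return "buffered"
    spread = arrival_times[-1] - arrival_times[0]
    return "streamed" if spread >= min_spread else "buffered"
```

With a correctly streaming model (like DeepSeek below) this should report "streamed"; the MiniMax bug would show up as "buffered".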

Expected behavior

The response should stream token-by-token (as DeepSeek does).

Actual behavior

The response is displayed in full only after generation completes.

OpenClaw version

2026.3.12

Operating system

Ubuntu 24.04

Install method

No response

Model

minimax/MiniMax-M2.5-highspeed

Provider / routing chain

openclaw->minimax

Config file / key location

No response

Additional provider/model setup details

No response

Logs, screenshots, and evidence

No response

Impact and severity

No response

Additional information

Notes

  • The DeepSeek model streams fine
  • A curl test against the MiniMax API confirms the API itself supports streaming

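Since the curl test shows the API side streams, the gap is likely in how the client consumes the response. Assuming MiniMax's chat completion endpoint returns OpenAI-style SSE chunks (`data: {...}` lines terminated by `data: [DONE]`, an assumption based on the curl test, not on OpenClaw internals), a minimal parser sketch for extracting content deltas:

```python
import json

def extract_deltas(sse_lines):
    """Collect content deltas from OpenAI-style SSE lines.

    Each chunk is a 'data: {...}' line; the stream ends with
    'data: [DONE]'. Blank keep-alive and comment lines are skipped.
    """
    deltas = []
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blanks, comments, non-data fields
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {}).get("content")
        if delta:
            deltas.append(delta)
    return deltas
```

If each delta is available as soon as its line arrives but the TUI only renders after `[DONE]`, the buffering is happening in the client's consumption or rendering path rather than in the provider.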
Metadata

    Labels

    bug: Something isn't working
    bug:behavior: Incorrect behavior without a crash
