Summary
mitmproxy integration broken - LLM requests fail with "LLM ERROR" instead of routing through the proxy
Steps to reproduce
- Set up mitmweb reverse proxy for api.minimax.io on port 8080
- Set HTTP_PROXY/HTTPS_PROXY env vars to http://127.0.0.1:8080
- Try to make an LLM request via MiniMax
- The request fails with "LLM ERROR" in the UI
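The steps above can be sketched as a shell session. This is a minimal reproduction sketch based on the report; the mitmweb reverse-proxy mode and port are taken from the steps above, while the exact OpenClaw invocation is omitted since the report does not specify it:

```shell
# Run mitmweb as a reverse proxy in front of the MiniMax API (port from the report)
mitmweb --mode reverse:https://api.minimax.io --listen-port 8080 &

# Point proxy-aware clients at the local mitmproxy instance
export HTTP_PROXY=http://127.0.0.1:8080
export HTTPS_PROXY=http://127.0.0.1:8080

# Any LLM request made from OpenClaw now should show up in the mitmweb UI;
# instead, the request bypasses the proxy and the UI reports "LLM ERROR"
```

Note that the environment variables only take effect if the client honors `HTTP_PROXY`/`HTTPS_PROXY`; the observed behavior suggests OpenClaw no longer does.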
Expected behavior
Traffic routes through the proxy as before
Actual behavior
The request fails with "LLM ERROR", and no traffic appears in mitmweb, indicating the proxy is not being used
Environment
- OpenClaw version: 2026.3.14
- Operating system: macOS 26.2 (arm64)
- Model: minimax-portal/MiniMax-M2.5
- Provider / routing chain: minimax-portal/MiniMax-M2.5
Impact
Cannot use LLM with mitmproxy for debugging
Evidence
- Model path: minimax-portal/MiniMax-M2.5
- Provider path: minimax-portal/MiniMax-M2.5
- Provider auth profiles: ~/.openclaw-dev/agents/main/agent/auth-profiles.json