Status: Closed
Labels: bug
Description
When the LLM context limit is exceeded, the raw API error message is forwarded verbatim to the user over the messaging channel (Telegram, etc.) instead of being handled gracefully.
Error seen by user
LLM request rejected: input length and max_tokens exceed context limit: 170636 + 34048 > 200000, decrease input length or max_tokens and try again
Expected behavior
These internal errors should be caught and either:
- Auto-compact and retry (which exists but apparently doesn't always catch these)
- Show a user-friendly message like "Session getting long, refreshing..." or just silently handle it
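The expected handling could look something like the sketch below: wrap each model call, pattern-match the provider's context-limit error, compact and retry once, and fall back to a friendly message rather than the raw error. All names here (`send_with_compaction`, `compact`, `notify_user`) are hypothetical, not the project's actual API.

```python
import re

# Matches the provider's context-overflow error text seen in this issue.
CONTEXT_LIMIT_RE = re.compile(r"exceed context limit")

def send_with_compaction(call_llm, compact, notify_user, max_retries=1):
    """Hypothetical wrapper: on a context-overflow error, compact the
    session and retry; never forward the raw provider error to the user."""
    for attempt in range(max_retries + 1):
        try:
            return call_llm()
        except Exception as exc:
            if CONTEXT_LIMIT_RE.search(str(exc)) and attempt < max_retries:
                notify_user("Session getting long, refreshing...")
                compact()  # e.g. summarize or drop the oldest turns
                continue
            # Out of retries, or an unrelated error: degrade gracefully.
            notify_user("Something went wrong, please try again.")
            return None
```

The key point is that the retry wrapper sits around *every* code path that surfaces model output to a channel, so no path can bypass it and leak the raw error.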
Environment
- Version: 2026.1.24-3
- Channel: Telegram
- Model: claude-opus-4-5
Notes
The release notes mention "auto-compact on context overflow prompt errors before failing", but this error is still reaching users. It looks like either the error is thrown before the retry logic kicks in, or some code path bypasses it entirely.