Bug Description
The OpenAI Chat Completions streaming parser (StreamingDelta in rig-core/src/providers/openai/completion/streaming.rs) does not include a reasoning_content field. This causes reasoning/thinking content from OpenAI-compatible providers to be silently dropped during streaming.
RawStreamingChoice::ReasoningDelta was added in PR #1395, and PR #1396 implemented reasoning support for 5 specialized providers (OpenAI Responses API, Anthropic, Gemini, xAI, OpenRouter). However, the generic OpenAI Chat Completions streaming parser — used by send_compatible_streaming_request — was not updated.
The DeepSeek provider has its own StreamingDelta with reasoning_content (line 722 of deepseek.rs), so it works correctly. But any provider using the generic OpenAI-compatible path loses reasoning content.
Affected Providers
All models that send reasoning_content via the standard Chat Completions streaming format:
- GLM-4.7 (Zhipu AI) — verified, 193 reasoning deltas silently dropped
- DeepSeek models via OpenAI-compatible endpoints (not the dedicated provider)
- Qwen with thinking mode (Alibaba DashScope)
- vLLM / Ollama serving reasoning models via OpenAI-compatible API
- Any other OpenAI-compatible endpoint returning delta.reasoning_content
Root Cause
`StreamingDelta` (line 35) only has `content` and `tool_calls`:

```rust
struct StreamingDelta {
    #[serde(default)]
    content: Option<String>,
    #[serde(default, deserialize_with = "json_utils::null_or_vec")]
    tool_calls: Vec<StreamingToolCall>,
}
```
The `reasoning_content` field in SSE chunks such as:

```json
{"choices":[{"delta":{"reasoning_content":"Let me think...","content":null}}]}
```

is silently dropped: serde's derive ignores any JSON field that is not declared on the struct (unless `#[serde(deny_unknown_fields)]` is set), so deserialization succeeds and the reasoning text simply vanishes.
Suggested Fix
Two changes in `rig-core/src/providers/openai/completion/streaming.rs`:

1. Add `reasoning_content` to `StreamingDelta`:

   ```rust
   struct StreamingDelta {
       #[serde(default)]
       content: Option<String>,
       #[serde(default)]
       reasoning_content: Option<String>, // ← add this
       #[serde(default, deserialize_with = "json_utils::null_or_vec")]
       tool_calls: Vec<StreamingToolCall>,
   }
   ```
2. Yield `ReasoningDelta` in the stream (before the text content block):

   ```rust
   if let Some(reasoning) = &delta.reasoning_content
       && !reasoning.is_empty()
   {
       yield Ok(streaming::RawStreamingChoice::ReasoningDelta {
           id: None,
           reasoning: reasoning.clone(),
       });
   }
   ```

   This matches the existing pattern in the DeepSeek provider (`deepseek.rs` line 840).
Reproduction
- Configure an OpenAI-compatible provider with a reasoning model (e.g. GLM-4.7 via https://open.bigmodel.cn/api/coding/paas/v4)
- Use `model.stream()` or `agent.stream_chat()`
- Observe: no `ReasoningDelta` / `Reasoning` events in the stream
- The model's thinking content is silently discarded
Environment
- rig-core: 0.31.0 (latest main)
- Tested with: GLM-4.7 (Zhipu AI OpenAI-compatible API)