
[Bug] Qwen3: Incorrect response field (reasoning_content instead of content) when enable_thinking=false with streaming enabled #5874

@yangs16

Description

Checklist

  • 1. I have searched related issues but cannot get the expected help.
  • 2. The bug has not been fixed in the latest version.
  • 3. Please note that if the bug-related issue you submitted lacks corresponding environment info and a minimal reproducible demo, it will be challenging for us to reproduce and resolve the issue, reducing the likelihood of receiving feedback.
  • 4. If the issue you raised is not a bug but a question, please raise a discussion at https://github.com/sgl-project/sglang/discussions/new/choose. Otherwise, it will be closed.
  • 5. Please use English, otherwise it will be closed.

Describe the bug

When enable_thinking=False and stream=True, the API incorrectly returns the response in the reasoning_content field rather than the expected content field.
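
For clarity: with `enable_thinking=false` there is no thinking phase, so per the OpenAI-compatible streaming schema the text should arrive in `delta.content` and `reasoning_content` should stay null. The chunks below instead carry every token in `reasoning_content`. An illustrative sketch (fields abbreviated from the actual chunks in the Response section):

```python
# Expected delta for a streamed token when thinking is disabled (abbreviated):
expected_delta = {"role": None, "content": "Hello", "reasoning_content": None}

# Actual delta observed in the chunks below (abbreviated):
actual_delta = {"role": None, "content": None, "reasoning_content": "Hello"}
```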

Reproduction

  • Request

```
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen3-32b-fp8",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"}
    ],
    "chat_template_kwargs": {"enable_thinking": false},
    "stream": true
  }'
```

  • Response

```
data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":"assistant","content":null,"reasoning_content":null,"tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":"Hello","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":"!","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":" How","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":" can","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":" I","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":" assist","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":" you","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":" today","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":"?","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":" ","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":"😊","tool_calls":null},"logprobs":null,"finish_reason":null,"matched_stop":null}],"usage":null}

data: {"id":"a64fca8ef31f4715b3304d10b4dc6c68","object":"chat.completion.chunk","created":1745908654,"model":"qwen3-32b-fp8","choices":[{"index":0,"delta":{"role":null,"content":null,"reasoning_content":null,"tool_calls":null},"logprobs":null,"finish_reason":"stop","matched_stop":null}],"usage":null}

data: [DONE]
```
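
The same behavior can be reproduced through the OpenAI Python SDK. A minimal sketch, assuming the same local server, placeholder API key, and model name as the curl command above; `reasoning_content` is not part of the standard OpenAI schema, so it is read defensively with `getattr`:

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

stream = client.chat.completions.create(
    model="qwen3-32b-fp8",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    stream=True,
    extra_body={"chat_template_kwargs": {"enable_thinking": False}},
)

content, reasoning = "", ""
for chunk in stream:
    delta = chunk.choices[0].delta
    content += delta.content or ""
    # reasoning_content is a server-side extension, not in the OpenAI schema
    reasoning += getattr(delta, "reasoning_content", None) or ""

print("content:          ", repr(content))    # expected to hold the reply, but is empty
print("reasoning_content:", repr(reasoning))  # actually receives the full reply
```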

Environment

Python: 3.10.0 (default, Mar 3 2022, 09:58:08) [GCC 7.5.0]
CUDA available: True
GPU 0,1,2,3: NVIDIA GeForce RTX 4090
GPU 0,1,2,3 Compute Capability: 8.9
CUDA_HOME: /usr/local/cuda
NVCC: Cuda compilation tools, release 12.4, V12.4.99
CUDA Driver Version: 550.54.14
PyTorch: 2.5.1+cu124
sglang: 0.4.6.post1
sgl_kernel: 0.1.0
flashinfer: Module Not Found
triton: 3.1.0
transformers: 4.51.3
torchao: 0.8.0
numpy: 1.26.4
aiohttp: 3.11.12
fastapi: 0.115.8
hf_transfer: 0.1.9
huggingface_hub: 0.30.2
interegular: 0.3.3
modelscope: 1.22.3
orjson: 3.10.15
outlines: 0.1.11
packaging: 24.2
psutil: 6.1.1
pydantic: 2.10.6
multipart: Module Not Found
zmq: Module Not Found
uvicorn: 0.34.0
uvloop: 0.21.0
vllm: 0.7.2
xgrammar: 0.1.11
openai: 1.62.0
tiktoken: 0.8.0
anthropic: 0.45.2
litellm: 1.61.1
decord: 0.6.0
NVIDIA Topology:

```
        GPU0  GPU1  GPU2  GPU3  CPU Affinity  NUMA Affinity  GPU NUMA ID
GPU0     X    PIX   PIX   PIX   0-23,48-71    0              N/A
GPU1    PIX    X    PIX   PIX   0-23,48-71    0              N/A
GPU2    PIX   PIX    X    PIX   0-23,48-71    0              N/A
GPU3    PIX   PIX   PIX    X    0-23,48-71    0              N/A
```

Legend:

X = Self
SYS = Connection traversing PCIe as well as the SMP interconnect between NUMA nodes (e.g., QPI/UPI)
NODE = Connection traversing PCIe as well as the interconnect between PCIe Host Bridges within a NUMA node
PHB = Connection traversing PCIe as well as a PCIe Host Bridge (typically the CPU)
PXB = Connection traversing multiple PCIe bridges (without traversing the PCIe Host Bridge)
PIX = Connection traversing at most a single PCIe bridge
NV# = Connection traversing a bonded set of # NVLinks

ulimit soft: 655360
