Ollama provider does not pass think:false — thinking models return empty content #50702

@appset

Description

Bug

When using Ollama models with native thinking support (e.g. qwen3.5:9b), OpenClaw does not send think: false in the Ollama API request even when thinking: "off" is configured.

Without think: false, the model puts its answer in the thinking field instead of content. Because OpenClaw drops thinking blocks, the user sees an empty response.

Version

  • OpenClaw: 2026.3.13
  • Ollama: latest (March 2026)
  • Model: qwen3.5:9b (any thinking-capable model)

Repro

  1. Set thinking: "off" in model params
  2. Send message that triggers tool call
  3. Tool returns result
  4. Model response has content: "" and thinking: "actual answer"
  5. User sees no response

Verified

# Without think:false → empty content
curl ollama { stream:true } → content:"" thinking:"answer"

# With think:false → correct
curl ollama { stream:true, think:false } → content:"answer" thinking:null
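The two request bodies from the check above differ only in the think field. A minimal sketch in TypeScript (model name taken from this report; the message content is illustrative):

```typescript
// Request body that reproduces the bug: no `think` field, so a
// thinking-capable model streams its answer into `thinking`.
const withoutThink = {
  model: "qwen3.5:9b",
  messages: [{ role: "user", content: "hi" }],
  stream: true,
};

// Adding `think: false` makes the same model answer in `content`.
const withThink = { ...withoutThink, think: false };
```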

Fix

In createOllamaStreamFn(), set a think field on the request body based on thinkingLevel:

const ollamaThink = !!(options?.thinkingLevel && options.thinkingLevel !== "off");
body.think = ollamaThink;
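A self-contained sketch of that mapping, assuming the shape of the options object; only createOllamaStreamFn and thinkingLevel come from this report, the helper names are hypothetical:

```typescript
// Hypothetical option shape: thinkingLevel mirrors the `thinking: "off"`
// model param from the repro above.
type StreamOptions = { thinkingLevel?: string };

// "off" or an unset level maps to think: false, so thinking-capable
// models answer in `content` instead of `thinking`.
function resolveOllamaThink(options?: StreamOptions): boolean {
  return Boolean(options?.thinkingLevel && options.thinkingLevel !== "off");
}

// Inside createOllamaStreamFn(), before the request is sent
// (buildOllamaBody is an illustrative name):
function buildOllamaBody(
  base: Record<string, unknown>,
  options?: StreamOptions,
): Record<string, unknown> {
  return { ...base, think: resolveOllamaThink(options) };
}
```

With this, `thinking: "off"` (and the default, unset case) always produces an explicit think: false in the Ollama request, which is what forces the model to answer in content.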
