Bug type
Regression (worked before, now fails)
Beta release blocker
No
Summary
Attempting to use `ollama/deepseek-v4-pro:cloud` with `/think max` results in the following response:

```
Thinking level "max" is not supported for ollama/deepseek-v4-pro:cloud. Use one of: off, minimal, low, medium, high.
```
I see from #71584 that this was fixed, so I am calling this a regression. I have no evidence that the fix actually worked since I skipped a large chunk of updates.
After quite a bit of debugging, my agent and I have determined that the ollama plugin is never "activated" and never registers its profile, so `resolveActiveThinkingProvider()` is unable to find the ollama profile.
Workaround
Set `activation.startup = true` in `extensions/ollama/openclaw.plugin.json`. This allows `/think max` to work as expected.
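For reference, the manifest change might look like the sketch below. Only the `activation.startup` key and the file path come from this report; the surrounding manifest structure (including the `name` field and the nesting of `activation`) is an assumption about the plugin schema:

```json
{
  "name": "ollama",
  "activation": {
    "startup": true
  }
}
```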
Steps to reproduce
- Start a local ollama instance.
- Authenticate the local ollama instance.
- Run `ollama pull deepseek-v4-pro`.
- Start OpenClaw v2026.5.3-1 with an `OLLAMA_API_KEY` set.
- With the main agent, run `/think max`.
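The steps above can be sketched as a shell session. Only `ollama pull deepseek-v4-pro`, the `OLLAMA_API_KEY` variable, and `/think max` come from this report; the `ollama serve` step and the exact OpenClaw invocation are assumptions:

```shell
# Start a local ollama instance in the background (assumed invocation)
ollama serve &

# Pull the model (from the report)
ollama pull deepseek-v4-pro

# Start OpenClaw with the API key in the environment
# (binary name and flags assumed; I run a pnpm-linked build)
OLLAMA_API_KEY="..." openclaw

# Then, inside the main agent session, issue:
#   /think max
```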
Expected behavior
Gateway responds with `Thinking level set to max.`
Actual behavior
Gateway responds with `Thinking level "max" is not supported for ollama/deepseek-v4-pro:cloud...`
OpenClaw version
v2026.5.3-1
Operating system
Fedora 43
Install method
built from source with pnpm -- installed via pnpm link -g .
Model
deepseek-v4-pro
Provider / routing chain
`openclaw -> local_ollama -> ollama_cloud`
Additional provider/model setup details
No response
Logs, screenshots, and evidence
Added debugging output during the resolution chain:
```
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] resolveThinkingProfile called provider=ollama model=deepseek-v4-pro:cloud normalizedProvider=ollama catalogReasoning=undefined
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] looking for provider: ollama registry has 0 providers: []
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] found entry: false undefined has resolveThinkingProfile: false
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] pluginProfile=false levels=undefined
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] looking for provider: ollama registry has 0 providers: []
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] found entry: false undefined has resolveThinkingProfile: false
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] looking for provider: ollama registry has 0 providers: []
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] found entry: false undefined has resolveThinkingProfile: false
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] binaryDecision=undefined => BASE profile
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] result levels=[ 'off', 'minimal', 'low', 'medium', 'high' ]
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] looking for provider: ollama registry has 0 providers: []
May 04 17:18:37 emerald node[89709]: [THINKING-DEBUG] found entry: false undefined has resolveThinkingProfile: false
```
And with `activation.startup = true` in `extensions/ollama/openclaw.plugin.json`:
```
May 04 17:23:50 emerald node[90413]: [THINKING-DEBUG] resolveThinkingProfile called provider=ollama model=deepseek-v4-pro:cloud normalizedProvider=ollama catalogReasoning=true
May 04 17:23:50 emerald node[90413]: [THINKING-DEBUG] looking for provider: ollama registry has 3 providers: [ 'minimax', 'minimax-portal', 'ollama' ]
May 04 17:23:50 emerald node[90413]: [THINKING-DEBUG] found entry: true ollama has resolveThinkingProfile: true
May 04 17:23:50 emerald node[90413]: [THINKING-DEBUG] pluginProfile=true levels=[ 'off', 'low', 'medium', 'high', 'max' ]
May 04 17:23:50 emerald node[90413]: [THINKING-DEBUG] normalized levels=[ 'off', 'low', 'medium', 'high', 'max' ] count=5
```
Impact and severity
Affected: all `ollama/deepseek-v4-pro` users.
Severity: Medium - unable to set an important model parameter
Frequency: Always
Consequence: full model performance is unavailable under OpenClaw
Additional information
No response