
[Bug]: Cannot change model to Ollama Deepseek-r1:latest #4544

@shivanraptor

Description


Summary

What went wrong?
I asked via chat (the original model is Qwen) to switch to the local Ollama model deepseek-r1:latest, which was already pulled and verified.

Steps to reproduce

  1. Installed Ollama and ran ollama pull deepseek-r1:latest
  2. Asked in chat to change the default model to DeepSeek (also tried Llama 3.3; same problem)
  3. The model cannot be changed. Running openclaw models list in the CLI reports:
% openclaw models list        

🦞 OpenClaw 2026.1.29 (a5b4d22) — One CLI to rule them all, and one more restart because you changed the port.

Model                                      Input      Ctx      Local Auth  Tags
ollama/deepseek-r1:latest                  -          -        -     -     default,missing
qwen-portal/vision-model                   text+image 125k     no    yes   fallback#1,configured
qwen-portal/coder-model                    text       125k     no    yes   configured,alias:qwen

The listing marks the Ollama DeepSeek model as missing.
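Before suspecting the gateway, it can help to confirm the tag really is present in Ollama's local listing. This is only a sketch: has_model is a hypothetical helper defined here, and only ollama list itself is a real command.

```shell
# has_model NAME — read `ollama list` output on stdin and succeed
# only if NAME appears in the NAME column (header row skipped).
has_model() {
  awk -v m="$1" 'NR > 1 && $1 == m { found = 1 } END { exit !found }'
}

# Against a running Ollama daemon (assumes a default local install):
#   ollama list | has_model deepseek-r1:latest && echo "tag is pulled"
```

If this check passes while openclaw models list still reports the model as missing, the problem is on the gateway side rather than in Ollama.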

Expected behavior

What did you expect to happen?
I expected the model to change and the gateway to restart, after which everything would work.

Actual behavior

What actually happened?
The chat stops responding afterwards.

Environment

  • OpenClaw version: 2026.1.29
  • OS: macOS Tahoe 26.2
  • Install method (pnpm/npx/docker/etc): npm install -g openclaw@latest

Logs or screenshots


The output of ollama list is:

% ollama list
NAME                  ID              SIZE      MODIFIED       
deepseek-r1:latest    6995872bfe4c    5.2 GB    14 minutes ago   

Metadata

Labels: bug (Something isn't working)
