Closed
Labels
bug (Something isn't working)
Description
Summary
What went wrong?
I have deployed OpenClaw on Armbian, but it cannot call the DeepSeek-R1 model on the local Ollama framework.
Steps to reproduce
- ollama list shows the deepseek-r1 model.
- This curl command works well too:
curl http://192.168.1.106:11434/api/generate -d '{
  "model": "deepseek-r1:14b",
  "prompt": "hello?"
}'
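Note that the check above exercises Ollama's native /api/generate route, while the config below (with "api": "openai-completions" and a baseUrl ending in /v1) makes OpenClaw talk to Ollama's OpenAI-compatible route instead, which expects a different request shape. A small sketch of the two request bodies, with the host and model copied from this report:

```python
import json

BASE = "http://192.168.1.106:11434"  # Ollama host from this report

# Native Ollama request -- the shape the curl check above sends.
native = {
    "url": f"{BASE}/api/generate",
    "body": {"model": "deepseek-r1:14b", "prompt": "hello?"},
}

# OpenAI-compatible chat request -- the shape an "openai-completions"
# provider with baseUrl ".../v1" sends (messages array, not a prompt).
openai_compat = {
    "url": f"{BASE}/v1/chat/completions",
    "body": {
        "model": "deepseek-r1:14b",
        "messages": [{"role": "user", "content": "hello?"}],
    },
}

for req in (native, openai_compat):
    print(req["url"], json.dumps(req["body"]))
```

So a useful extra repro step is to curl /v1/chat/completions with the second payload: if that call also hangs or errors while /api/generate works, the problem is on the OpenAI-compatibility path rather than in OpenClaw itself.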
- OpenClaw works fine with other models, but with the config below it does not respond:
"models": {
"providers": {
"ollama": {
"baseUrl": "http://192.168.1.106:11434/v1",
"apiKey": "ollama",
"api": "openai-completions",
"models": [
{
"id": "deepseek-r1:14b",
"name": "deepseek-r1:14b",
"reasoning": true,
"input": [
"text"
],
"cost": {
"input": 0,
"output": 0,
"cacheRead": 0,
"cacheWrite": 0
},
"contextWindow": 32768,
"maxTokens": 40960
}
]
}
}
},
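One detail in this config worth double-checking: maxTokens (40960) is larger than contextWindow (32768), which would let the requested output exceed the model's context. Whether OpenClaw rejects or mishandles such a combination is only a guess, but a fragment with the output cap brought under the context window would look like:

```json
{
  "id": "deepseek-r1:14b",
  "name": "deepseek-r1:14b",
  "reasoning": true,
  "input": ["text"],
  "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
  "contextWindow": 32768,
  "maxTokens": 8192
}
```

The 8192 value here is an arbitrary illustration, not a confirmed fix.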
Expected behavior
What did you expect to happen?
OpenClaw should get a reply from deepseek-r1:14b, just as it does with the other configured models.
Actual behavior
What actually happened?
OpenClaw never gets a reply; no response comes back from the model.
Environment
- Clawdbot version: 2026.1.29
- OS: Armbian OS 25.11.0 bookworm
- Install method (pnpm/npx/docker/etc): npm
Logs or screenshots