issue: When using an Ollama-compatible LLM provider that does not use ":latest" in its model names, no chat is possible #21331

@daniel-georg369

Description

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

0.7.2

Ollama Version (if applicable)

No Ollama (an Ollama-compatible external provider is used instead)

Operating System

Windows

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

  • An Ollama-compatible LLM provider is added
  • the models are available to be chosen for a chat
  • the chat works

Actual Behavior

  • An Ollama-compatible LLM provider is added
  • the models are available to be chosen for a chat
  • the chat does not work; instead the error "400: Model 'azure-o4-mini:latest' was not found" is shown.

Steps to Reproduce

  • Install Windows
  • Install Docker
  • Pull and then run the Open WebUI container with debug logging (see the example command after this list)
  • Add Ollama as LLM provider - available models can be chosen for a chat and work
  • Add a custom LLM provider (which does not use ":xxx" like ":latest" or ":14b" in the model names) - available models can be chosen for a chat but don't work
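
For reference, a typical launch command for step 3 might look like the following (port mapping, volume, and container name are examples to adjust to your setup; GLOBAL_LOG_LEVEL=DEBUG enables the debug output shown below):

    docker run -d -p 3000:8080 \
      -e GLOBAL_LOG_LEVEL=DEBUG \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main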

Logs & Screenshots

2026-02-12 15:16:02.437 | DEBUG | open_webui.utils.middleware:process_chat_payload:1641 - tool_ids=None

2026-02-12 15:16:02.437 | DEBUG | open_webui.utils.middleware:process_chat_payload:1642 - direct_tool_servers=[]

2026-02-12 15:16:02.438 | DEBUG | open_webui.utils.chat:generate_chat_completion:171 - generate_chat_completion: {'stream': True, 'model': 'azure-o4-mini', 'messages': [{'role': 'user', 'content': "What are 5 creative things I could do with my kids' art? I don't want to throw them away, but it's also so much clutter."}], 'metadata': {'user_id': '50f9c58c-88a6-41df-ad7d-41585fb4a24e', 'chat_id': '01ab8261-b7ee-48cc-91fd-1976321df6f8', 'message_id': '5c397ddc-4c87-4e5c-a936-6fd35fb31b6a', 'parent_message': {'id': 'c0c7598d-41af-4854-9456-887802f0ab2f', 'parentId': None, 'childrenIds': ['5c397ddc-4c87-4e5c-a936-6fd35fb31b6a'], 'role': 'user', 'content': "What are 5 creative things I could do with my kids' art? I don't want to throw them away, but it's also so much clutter.", 'timestamp': 1770909362, 'models': ['azure-o4-mini']}, 'parent_message_id': 'c0c7598d-41af-4854-9456-887802f0ab2f', 'session_id': 'VTo6LB43IhJiEhw5AAAD', 'filter_ids': [], 'tool_ids': None, 'tool_servers': [], 'files': None, 'features': {'voice': False, 'image_generation': False, 'code_interpreter': False, 'web_search': False}, 'variables': {'{{USER_NAME}}': 'My User', '{{USER_LOCATION}}': 'Unknown', '{{CURRENT_DATETIME}}': '2026-02-12 16:16:02', '{{CURRENT_DATE}}': '2026-02-12', '{{CURRENT_TIME}}': '16:16:02', '{{CURRENT_WEEKDAY}}': 'Thursday', '{{CURRENT_TIMEZONE}}': 'Europe/Berlin', '{{USER_LANGUAGE}}': 'en-US'}, 'model': {'id': 'azure-o4-mini', 'name': 'azure-o4-mini', 'object': 'model', 'created': 1770909353, 'owned_by': 'ollama', 'ollama': {'name': 'azure-o4-mini', 'model': 'azure-o4-mini', 'connection_type': 'external', 'urls': [2]}, 'connection_type': 'external', 'tags': [], 'actions': [], 'filters': []}, 'direct': False, 'params': {'stream_delta_chunk_size': None, 'reasoning_tags': None, 'function_calling': 'default'}}, 'options': {}}

2026-02-12 15:16:02.440 | DEBUG | open_webui.main:process_chat:1746 - Error processing chat payload: 400: Model 'azure-o4-mini:latest' was not found

[Screenshot: chat UI showing the error "400: Model 'azure-o4-mini:latest' was not found"]

Additional Information

The model names delivered by the LLM provider contain no ":latest" or ":14b"; there is no ":" in the names at all.
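
For illustration, listing the provider's models via the Ollama-compatible tags endpoint returns bare names; the base URL below is a placeholder for whatever gateway is used in this setup:

    import requests

    # Placeholder base URL of the Ollama-compatible provider
    BASE_URL = "http://my-llm-gateway:4000"

    # GET /api/tags is the Ollama endpoint used for model listing
    resp = requests.get(f"{BASE_URL}/api/tags", timeout=10)
    resp.raise_for_status()
    for m in resp.json().get("models", []):
        print(m["name"])  # e.g. "azure-o4-mini" - no ":" tag in the name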

In main/backend/open_webui/routers/ollama.py there are several code snippets with:

    if ":" not in model:
        model = f"{model}:latest"

So it seems Open WebUI always appends ":latest" even when it is not necessary.
The same LLM provider works without any problems in n8n using its Ollama tool.
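
A possible mitigation, sketched here only as an illustration (this is not the project's actual fix, and the `available` set would have to come from the provider's /api/tags response), would be to append ":latest" only when the tagged name actually exists upstream:

    def normalize_model_name(model: str, available: set[str]) -> str:
        # Names that already carry a tag are left untouched
        if ":" in model:
            return model
        # Only add ":latest" if the provider actually knows that tagged name;
        # otherwise keep the bare name exactly as the provider delivered it
        tagged = f"{model}:latest"
        return tagged if tagged in available else model

With a provider that reports "azure-o4-mini", normalize_model_name("azure-o4-mini", {"azure-o4-mini"}) would return the bare name instead of the non-existent "azure-o4-mini:latest".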
