
Doesn't work with OpenWebUI Ollama connection #37

@vk2r

Description


I've been waiting for someone to do this. It's fantastic. I tried the following and it didn't work for me with OpenWebUI:

```yaml
olla:
    image: ghcr.io/thushan/olla:${OLLA_VERSION}
    container_name: olla
    ports:
      - 40114:40114
    volumes:
      - ./olla.yaml:/config.yaml
ollama:
    image: ollama/ollama:${OLLAMA_VERSION}
    container_name: ollama
    ports:
      - 11434:11434
    volumes:
      - ollama_data:/root/.ollama
```
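One thing worth noting about this layout (a general Docker Compose fact, not something confirmed in this issue): services declared in the same Compose file share a default network and can reach each other by service name, while `localhost` inside a container refers to that container alone. A self-contained sketch, with the `services:` root and named-volume declaration the fragment above omits added only for completeness:

```yaml
# Sketch (assumed layout): both services on the default Compose network,
# so the olla container can reach Ollama at http://ollama:11434 via
# service-name DNS. "localhost" inside the olla container is olla itself.
services:
  olla:
    image: ghcr.io/thushan/olla:${OLLA_VERSION}
    ports:
      - 40114:40114
    volumes:
      - ./olla.yaml:/config.yaml
  ollama:
    image: ollama/ollama:${OLLAMA_VERSION}
    ports:
      - 11434:11434
    volumes:
      - ollama_data:/root/.ollama

volumes:
  ollama_data:
```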

With this configuration:

```yaml
server:
  host: 0.0.0.0
  port: 40114

proxy:
  engine: "olla"          # high-performance engine
  load_balancer: "priority" # or round-robin, least-connections

discovery:
  endpoints:
    - name: "server"
      url: "http://localhost:11434"
      platform: "ollama"
      priority: 100         # Higher = preferred
      tags:
        models: "hf.co/unsloth/Qwen3-Coder-30B-A3B-Instruct-GGUF:IQ2_XXS"

    - name: "desktop"
      url: "http://other:11434"
      platform: "ollama"
      priority: 50          # Lower priority fallback
      tags:
        models: "hf.co/unsloth/Qwen3-Coder-30B-A3B-Instruct-GGUF:IQ2_XXS"
```
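A likely culprit (an assumption, not confirmed by the reporter): since Olla runs in its own container, `http://localhost:11434` in the `server` endpoint points back at the olla container itself, not at the Ollama container. Using the Compose service name instead might look like:

```yaml
discovery:
  endpoints:
    - name: "server"
      # "ollama" is the Compose service name from the docker-compose
      # fragment above; inside the olla container, localhost does not
      # reach the Ollama container.
      url: "http://ollama:11434"
      platform: "ollama"
      priority: 100
```

The `desktop` endpoint (`http://other:11434`) would be unaffected, since it already uses a hostname rather than `localhost`.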

And in the OpenWebUI connection: http://URL:PORT/olla/ollama

And OpenWebUI does not recognize it...

PS: I have Ollama and Olla on the same Docker server.

Metadata

Labels: bug (Something isn't working), documentation (Improvements or additions to documentation)
