language_models: Add auto_discover setting for Ollama #42207

Merged
bennetbo merged 1 commit into zed-industries:main from xfbs:ollama-autodiscovery-toggle
Dec 16, 2025

Conversation

@xfbs
Contributor

@xfbs xfbs commented Nov 7, 2025

First up: I'm sorry if this is a low quality PR, or if this feature isn't wanted. I implemented this because I'd like to have this behaviour. If you don't think that this is useful, feel free to close the PR without comment. :)

My idea is this: I love to pull random models with Ollama to try them. At the same time, not all of them are useful for coding, and some won't work out of the box with the default context_length. So, I'd like to change Zed's behaviour to not show me all models Ollama has, but to limit it to the ones that I configure manually.

What I did is add an auto_discover field to the settings. The idea is that you can write a config like this:

```json
"language_models": {
    "ollama": {
      "api_url": "http://localhost:11434",
      "auto_discover": false,
      "available_models": [
        {
          "name": "qwen3:4b",
          "display_name": "Qwen3 4B 32K",
          "max_tokens": 32768,
          "supports_tools": true,
          "supports_thinking": true,
          "supports_images": true
        }
      ]
    }
  }
```

Setting auto_discover: false means that Zed won't pick up or show the language models that Ollama knows about, and will only show me the ones I manually configure in available_models. That way, I can pull random models with Ollama, but in Zed I only see the ones that I know work (because I've configured them).

The default for auto_discover (when it is not explicitly set) is true, meaning that the existing behaviour is preserved, and this is not a breaking change for configurations.
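The described behaviour can be sketched roughly as follows. This is an illustrative sketch only, not Zed's actual code: the names `OllamaSettings`, `Model`, and `visible_models` are hypothetical, and the real implementation lives in Zed's language model provider.

```rust
// Hypothetical sketch of the auto_discover filtering behaviour.
// Names here are illustrative, not Zed's real types or functions.

#[derive(Clone, Debug, PartialEq)]
struct Model {
    name: String,
}

struct OllamaSettings {
    /// Defaults to true when absent, preserving the existing behaviour.
    auto_discover: bool,
    available_models: Vec<Model>,
}

/// Returns the models Zed would show: always the manually configured
/// ones, plus the models discovered from Ollama only when
/// `auto_discover` is enabled.
fn visible_models(settings: &OllamaSettings, discovered: &[Model]) -> Vec<Model> {
    let mut models = settings.available_models.clone();
    if settings.auto_discover {
        // Append discovered models that aren't already configured manually.
        for m in discovered {
            if !models.iter().any(|c| c.name == m.name) {
                models.push(m.clone());
            }
        }
    }
    models
}

fn main() {
    let discovered = vec![
        Model { name: "llama3:8b".into() },
        Model { name: "qwen3:4b".into() },
    ];
    let settings = OllamaSettings {
        auto_discover: false,
        available_models: vec![Model { name: "qwen3:4b".into() }],
    };
    // With auto_discover disabled, only the configured model is visible.
    let visible = visible_models(&settings, &discovered);
    println!("{}", visible.len()); // prints 1
}
```

With `auto_discover: true` (the default), the same function would also append `llama3:8b`, matching the pre-existing behaviour.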

Release Notes:

  • ollama: Added auto_discover setting to optionally limit visible models to only those manually configured in available_models

@cla-bot

cla-bot bot commented Nov 7, 2025

We require contributors to sign our Contributor License Agreement, and we don't have @xfbs on file. You can sign our CLA at https://zed.dev/cla. Once you've signed, post a comment here that says '@cla-bot check'.

@maxdeviant maxdeviant changed the title language_models: adds 'auto_discover' setting for Ollama language_models: Add auto_discover setting for Ollama Nov 7, 2025
@xfbs
Contributor Author

xfbs commented Nov 7, 2025

@cla-bot check

@cla-bot cla-bot bot added the cla-signed The user has signed the Contributor License Agreement label Nov 7, 2025
@cla-bot

cla-bot bot commented Nov 7, 2025

The cla-bot has been summoned, and re-checked this pull request!

This adds a setting flag named 'auto_discover' for the Ollama LLM
executor, which stops Zed from automatically picking up all models
that Ollama has pulled, showing only manually listed models.

By default, auto discovery is enabled, preserving the current behaviour.

Release Notes:

- Adds an 'auto_discover' field for the language_models.ollama setting.
@xfbs xfbs force-pushed the ollama-autodiscovery-toggle branch from 3938c96 to a2309c1 Compare November 7, 2025 16:17
@SomeoneToIgnore SomeoneToIgnore added the area:ai Improvement related to Agent Panel, Edit Prediction, Copilot, or other AI features label Nov 7, 2025
@bennetbo bennetbo self-assigned this Nov 10, 2025
@franciskafyi franciskafyi assigned benbrandt and unassigned bennetbo Dec 13, 2025
Member

@bennetbo bennetbo left a comment

Thank you!

@bennetbo bennetbo merged commit ebd5a50 into zed-industries:main Dec 16, 2025
32 checks passed
@github-project-automation github-project-automation bot moved this from Community PRs to Done in Quality Week – December 2025 Dec 16, 2025
CherryWorm pushed a commit to CherryWorm/zed that referenced this pull request Dec 16, 2025

4 participants