ollama: Don't override model's default stop tokens #48119
Merged
benbrandt merged 2 commits into zed-industries:main · Mar 17, 2026
Conversation
Aha, that explains the behavior I was seeing! I was excited to try it but just thought it wasn't as good as it had been hyped up to be.
Member

@jswetzen do you mind rebasing/merging main in to fix the CI issues?
I don't have the codebase either forked or checked out, but I bet @littleKitchen would be willing :)
When no stop tokens are provided, Zed was sending an empty array (`"stop": []`) to Ollama. This caused Ollama to override the model's default stop tokens (defined in its Modelfile) with nothing, resulting in models like rnj-1:8b generating infinitely with literal stop tokens (e.g., `<|eot_id|>`) appearing in the output.

Now when `request.stop` is empty, we set `stop: None`, which causes the field to be omitted from the JSON entirely. This allows Ollama to use the model's default stop tokens as configured in its Modelfile.

Also added `#[serde(skip_serializing_if = "Option::is_none")]` to all `ChatOptions` fields to ensure `None` values are omitted from the JSON, not serialized as `null`.

Fixes zed-industries#47798
6b55e04 to c99ad23
Summary

When no stop tokens are provided, Zed was sending an empty array (`"stop": []`) to Ollama. This caused Ollama to override the model's default stop tokens (defined in its Modelfile) with nothing, resulting in models like rnj-1:8b generating infinitely with literal stop tokens appearing in the output.

Problem

Models with custom stop tokens in their Modelfile (like `<|eot_id|>` for rnj-1:8b) would generate forever because:

- `stop: Vec::new()` (empty) became `stop: Some(vec![])`
- which serialized as `"stop": []` in JSON

Solution

In `crates/language_models/src/provider/ollama.rs`:

- keep `stop` when explicitly provided
- otherwise use `None` so the field is omitted from the JSON

In `crates/ollama/src/ollama.rs`:

- added `#[serde(skip_serializing_if = "Option::is_none")]` to all `ChatOptions` fields
- `None` values are omitted, not serialized as `null`

Testing

Added 4 new tests in `crates/ollama/src/ollama.rs`:

- `test_chat_options_serialization`: verifies `None` fields are omitted
- `test_chat_request_with_stop_tokens`: verifies stop tokens are serialized when provided
- `test_chat_request_without_stop_tokens_omits_field`: verifies an empty stop list omits the field

All 11 ollama tests pass, plus 1 language_models ollama test.
Fixes #47798
Release Notes: