
ollama: Don't override model's default stop tokens #48119

Merged
benbrandt merged 2 commits into zed-industries:main from littleKitchen:fix/47798-ollama-stop-tokens
Mar 17, 2026

Conversation

@littleKitchen (Contributor) commented Feb 1, 2026

Summary

When no stop tokens are provided, Zed was sending an empty array ("stop": []) to Ollama. This caused Ollama to override the model's default stop tokens (defined in its Modelfile) with nothing, resulting in models like rnj-1:8b generating infinitely with literal stop tokens appearing in the output.

Problem

Models with custom stop tokens in their Modelfile (like <|eot_id|> for rnj-1:8b) would generate forever because:

  1. Agent thread creates request with stop: Vec::new() (empty)
  2. Ollama provider converts this to stop: Some(vec![])
  3. Serializes as "stop": [] in JSON
  4. Ollama interprets this as "override default stop tokens with nothing"
  5. Model generates forever, outputting stop tokens as literal text
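
The five-step failure path can be reproduced in miniature. This is a stdlib-only sketch: `serialize_stop` is a hypothetical stand-in for what serde did with the real `ChatOptions` struct, not Zed's actual code.

```rust
// Hypothetical stand-in for how the pre-fix payload was produced;
// Zed's real code uses serde-derived structs, hand-rolled here to
// stay stdlib-only.
fn serialize_stop(stop: &Option<Vec<String>>) -> String {
    match stop {
        None => String::from("{}"),
        Some(tokens) if tokens.is_empty() => String::from("{\"stop\":[]}"),
        Some(tokens) => format!("{{\"stop\":[\"{}\"]}}", tokens.join("\",\"")),
    }
}

fn main() {
    // Step 1: the agent thread builds the request with stop: Vec::new().
    let request_stop: Vec<String> = Vec::new();
    // Step 2: the old provider code wrapped it unconditionally in Some.
    let options_stop = Some(request_stop);
    // Steps 3-5: the payload carries an explicit empty array, which
    // Ollama reads as "replace the Modelfile's stop tokens with nothing",
    // so the model never stops generating.
    assert_eq!(serialize_stop(&options_stop), "{\"stop\":[]}");
    println!("{}", serialize_stop(&options_stop));
}
```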

Solution

  1. In crates/language_models/src/provider/ollama.rs:

    • Only send stop when explicitly provided
    • When empty, use None so the field is omitted from JSON
  2. In crates/ollama/src/ollama.rs:

    • Add #[serde(skip_serializing_if = "Option::is_none")] to all ChatOptions fields
    • Ensures None values are omitted, not serialized as null
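
A minimal sketch of the two changes combined, using illustrative names (`stop_option`, `render`) rather than Zed's actual functions; the hand-rolled renderer mimics what `#[serde(skip_serializing_if = "Option::is_none")]` does in the real serde-derived struct:

```rust
// Hypothetical stand-in for the fixed conversion in the provider:
// an empty stop list becomes None so the field can be dropped.
fn stop_option(stop: Vec<String>) -> Option<Vec<String>> {
    if stop.is_empty() { None } else { Some(stop) }
}

// Hand-rolled approximation of a serde field annotated with
// #[serde(skip_serializing_if = "Option::is_none")]: None is omitted
// from the JSON instead of being serialized as null.
fn render(stop: &Option<Vec<String>>) -> String {
    match stop {
        None => String::from("{}"),
        Some(tokens) => format!("{{\"stop\":[\"{}\"]}}", tokens.join("\",\"")),
    }
}

fn main() {
    // No stop tokens: the field is omitted, so Ollama keeps the
    // defaults from the model's Modelfile (e.g. <|eot_id|> for rnj-1:8b).
    assert_eq!(render(&stop_option(Vec::new())), "{}");
    // Explicitly provided stop tokens still go through unchanged.
    assert_eq!(
        render(&stop_option(vec!["<|eot_id|>".to_string()])),
        "{\"stop\":[\"<|eot_id|>\"]}"
    );
}
```

Omitting the field entirely is what lets Ollama fall back to the Modelfile defaults; the serde attribute ensures a `None` becomes an absent key rather than an explicit `null`.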

Testing

Added 4 new tests in crates/ollama/src/ollama.rs:

  • test_chat_options_serialization: Verifies None fields are omitted
  • test_chat_request_with_stop_tokens: Verifies stop tokens are serialized when provided
  • test_chat_request_without_stop_tokens_omits_field: Verifies empty stop is omitted

All 11 ollama tests pass, plus 1 language_models ollama test.

Fixes #47798

Release Notes:

  • Fixed Ollama models with custom stop tokens generating infinitely by not overriding model defaults when no stop tokens are specified.

@cla-bot cla-bot bot added the cla-signed The user has signed the Contributor License Agreement label Feb 1, 2026
@SomeoneToIgnore SomeoneToIgnore added the area:ai Related to Agent Panel, Edit Prediction, Copilot, or other AI features label Feb 1, 2026
@maxdeviant maxdeviant changed the title fix(ollama): don't override model's default stop tokens ollama: Don't override model's default stop tokens Feb 1, 2026
@jswetzen

Aha, that explains the behavior I was seeing! I was excited to try it but just thought it wasn't as good as it had been hyped up to be.

@benbrandt (Member) left a comment

Makes sense, good find!

@benbrandt benbrandt enabled auto-merge (squash) February 12, 2026 12:39
@benbrandt benbrandt self-assigned this Feb 12, 2026
@benbrandt (Member)

@jswetzen do you mind rebasing/merging main in to fix the CI issues?

@jswetzen

I don't have the codebase either forked or checked out, but I bet @littleKitchen would be willing :)

When no stop tokens are provided, Zed was sending an empty array
(`"stop": []`) to Ollama. This caused Ollama to override the model's
default stop tokens (defined in its Modelfile) with nothing, resulting
in models like rnj-1:8b generating infinitely with literal stop tokens
(e.g., `<|eot_id|>`) appearing in the output.

Now when `request.stop` is empty, we set `stop: None` which causes the
field to be omitted from the JSON entirely. This allows Ollama to use
the model's default stop tokens as configured in its Modelfile.

Also added `#[serde(skip_serializing_if = "Option::is_none")]` to all
`ChatOptions` fields to ensure None values are omitted from the JSON,
not serialized as null.

Fixes zed-industries#47798
@benbrandt benbrandt force-pushed the fix/47798-ollama-stop-tokens branch from 6b55e04 to c99ad23 on March 17, 2026 10:27
Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
@benbrandt benbrandt merged commit a80b019 into zed-industries:main Mar 17, 2026
29 checks passed
Linked issue: Ollama: rnj-1 model generates infinitely with literal stop tokens in output