
Fix chat template's Ollama special-casing #5975

@stephentoub

Description


The template currently has this:

```razor
@*#if (IsOllama)
// Display a new response from the IChatClient. Streaming responses
// aren't used because Ollama won't support streaming and tool use together.
currentResponseCancellation = new();
var response = await ChatClient.GetResponseAsync(messages, chatOptions, currentResponseCancellation.Token);
currentResponseMessage = response.Message;
ChatMessageItem.NotifyChanged(currentResponseMessage);
```

This should no longer be an issue with Microsoft.Extensions.AI.Ollama, as it should have been resolved in December with #5730.
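If the limitation really is gone, the `IsOllama` branch could presumably be dropped and the streaming path used unconditionally. A minimal sketch of what that would look like, assuming the surrounding template state (`currentResponseCancellation`, `currentResponseMessage`, `ChatMessageItem`) and using `IChatClient.GetStreamingResponseAsync` — this is illustrative, not the template's actual code:

```csharp
// Hypothetical sketch: stream the response the same way the non-Ollama
// branch does, accumulating chunks into the in-progress assistant message.
currentResponseCancellation = new();
var responseText = new TextContent("");
currentResponseMessage = new ChatMessage(ChatRole.Assistant, [responseText]);

await foreach (var update in
    ChatClient.GetStreamingResponseAsync(messages, chatOptions, currentResponseCancellation.Token))
{
    // Append each streamed chunk and re-render the message as it grows.
    responseText.Text += update.Text;
    ChatMessageItem.NotifyChanged(currentResponseMessage);
}
```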

If the problem is that OllamaSharp doesn't currently enable this, we should contribute a fix to OllamaSharp and then fix the template.

Otherwise, we should root-cause the problem, fix it, and then fix the template.
