The template currently has this:
```razor
@*#if (IsOllama)
// Display a new response from the IChatClient, streaming responses
// aren't supported because Ollama will not support both streaming and using Tools
currentResponseCancellation = new();
var response = await ChatClient.GetResponseAsync(messages, chatOptions, currentResponseCancellation.Token);
currentResponseMessage = response.Message;
ChatMessageItem.NotifyChanged(currentResponseMessage);
```
This should no longer be an issue with Microsoft.Extensions.AI.Ollama; it should have been resolved in December by #5730.
If the problem is that OllamaSharp doesn't currently enable this, we should contribute a fix to OllamaSharp and then fix the template.
Otherwise, we should root-cause the problem, fix it, and then fix the template.
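Once streaming with tools works for the Ollama-backed IChatClient, the template's non-streaming call could be replaced with the streaming API. A rough sketch of what that change might look like, assuming the Microsoft.Extensions.AI streaming surface (`GetStreamingResponseAsync` yielding `ChatResponseUpdate` items) and reusing the field names from the snippet above:

```csharp
// Sketch only: assumes streaming + tools now works end to end for Ollama.
// GetStreamingResponseAsync returns IAsyncEnumerable<ChatResponseUpdate>,
// so the UI can be refreshed as each chunk arrives instead of waiting
// for the full response.
currentResponseCancellation = new();
var responseText = new System.Text.StringBuilder();

await foreach (var update in ChatClient.GetStreamingResponseAsync(
    messages, chatOptions, currentResponseCancellation.Token))
{
    responseText.Append(update.Text);
    currentResponseMessage = new ChatMessage(ChatRole.Assistant, responseText.ToString());
    ChatMessageItem.NotifyChanged(currentResponseMessage);
}
```

This would also let the `@*#if (IsOllama)` template conditional go away, since the Ollama and non-Ollama paths would then be identical.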
(The snippet above is from extensions/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData.Web/Components/Pages/Chat/Chat.razor, lines 64 to 70 at commit e8ee4d1.)