Fix tool calling detection and support for Ollama models #3566

Merged: lramos15 merged 2 commits into microsoft:main on Feb 11, 2026
Conversation
Comment: Facing the same issue. Is there a workaround I can use until the fix is propagated to a release?
lramos15 approved these changes on Feb 10, 2026
Tyriar approved these changes on Feb 10, 2026
Comment: Tested the PR in VS Code Insiders. Same as @rajasrijan, I can see the models but none of them work.
Author: The 404 issue was not caused by this PR, so I submitted a new one that should resolve it.
Comment: Should this be fixed? I'm running v0.37.8 with VS Code 1.109.5 and I'm still experiencing this issue. EDIT: never mind, sorry. I manually removed the extension files from .vscode/extensions and redownloaded it. It still returns a 404, but the models are visible.
Comment: Is it fixed? In 0.37.9 I'm still experiencing the issue, even after deleting and redownloading the extension.
Summary
This PR fixes issues where Ollama models could not be used in Agent mode because tool calling capabilities were not correctly identified or propagated.
Environment
- Ollama
- `mistral-small3.2:latest` (supports vision and tool calling)
- VS Code
Changes
- Pass `knownModels` instead of `models` to `byokKnownModelsToAPIInfo`
- Initialize `_knownModels` in `OllamaLMProvider` so that `supportsToolCalls` is correctly derived for Ollama models
- Fixes: Ollama models cannot be used in Agent mode
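The first change above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the type shapes and function bodies here are hypothetical stand-ins; only the names `getAllModels`, `byokKnownModelsToAPIInfo`, `knownModels`, and `models` come from the PR description.

```typescript
// Hypothetical capability shapes, for illustration only.
interface ModelCapabilities { toolCalling: boolean; vision: boolean; }
type KnownModels = Record<string, ModelCapabilities>;

// Stand-in for byokKnownModelsToAPIInfo: it expects the known-model
// capability map, from which API-facing info (including
// supportsToolCalls) is derived.
function byokKnownModelsToAPIInfo(id: string, knownModels: KnownModels | undefined) {
  const caps = knownModels?.[id];
  return { id, supportsToolCalls: caps?.toolCalling ?? false };
}

// Before the fix, the raw `models` list (which carries no capability
// data) was passed in; passing `knownModels` lets the tool-calling
// flag propagate to each model's API info.
function getAllModels(modelIds: string[], knownModels: KnownModels) {
  return modelIds.map(id => byokKnownModelsToAPIInfo(id, knownModels));
}
```

With the wrong argument, every lookup misses and `supportsToolCalls` collapses to `false` for all models, which is the symptom described in Issue 1 below.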
Issue 1: Ollama models are not correctly recognized as supporting tool calls

Ollama models are not correctly identified as supporting the "tools" capability. This happens because `getAllModels` passes `models` to `byokKnownModelsToAPIInfo`. However, based on the expected input of `byokKnownModelsToAPIInfo`, `knownModels` should be passed instead. After applying this change, the model information is correctly populated.

Issue 2: The `tools` field is not sent to the Ollama model

When invoking an Ollama model, the `tools` section is not included in the request. This happens because the request-building code checks `supportsToolCalls`; if it is false, the `tools` field is removed. The value of `supportsToolCalls` ultimately comes from `knownModelInfo`, which is passed in from an upper layer and originates from `this._knownModels`. However, `_knownModels` is never initialized in `OllamaLMProvider`, so it falls back to the base class `AbstractOpenAICompatibleLMProvider`. Meanwhile, `OllamaLMProvider` passes `undefined` to `super`, which causes `this._knownModels` to remain `undefined`. `_knownModels` should therefore be initialized during `getAllModels` execution to ensure tool capability is propagated correctly.
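The flow described in Issue 2 can be sketched as below. This is a hypothetical reconstruction under the PR's description, not the extension's real code: the class bodies, method signatures, and request shape are assumptions; only the class names `OllamaLMProvider` and `AbstractOpenAICompatibleLMProvider` and the fields `_knownModels` / `supportsToolCalls` come from the PR.

```typescript
interface ModelCapabilities { toolCalling: boolean; vision: boolean; }
type KnownModels = Record<string, ModelCapabilities>;

// Minimal stand-in for the base provider described above.
class AbstractOpenAICompatibleLMProvider {
  constructor(protected _knownModels: KnownModels | undefined) {}

  supportsToolCalls(modelId: string): boolean {
    // If _knownModels is undefined, every model looks tool-incapable.
    return this._knownModels?.[modelId]?.toolCalling ?? false;
  }

  buildRequest(modelId: string, tools: object[]) {
    const body: { model: string; tools?: object[] } = { model: modelId };
    // The check that strips the tools field when support is not known.
    if (this.supportsToolCalls(modelId)) body.tools = tools;
    return body;
  }
}

class OllamaLMProvider extends AbstractOpenAICompatibleLMProvider {
  constructor() {
    super(undefined); // the bug: _knownModels stays undefined
  }

  // The fix: populate _knownModels while listing models, so later
  // supportsToolCalls lookups see the discovered capabilities.
  getAllModels(discovered: KnownModels): string[] {
    this._knownModels = discovered;
    return Object.keys(discovered);
  }
}
```

Before `getAllModels` populates `_knownModels`, `buildRequest` silently drops `tools` for every model, which matches the "models visible but none of them work in Agent mode" symptom reported in the thread.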