
Conversation

@Gerome-Elassaad
Member

How to use local models

  • Install local models on your device first; once the provider is configured, they will show up in the provider/model dropdown
  • Go to Settings → Local Providers
  • Configure Ollama base URL (e.g., http://127.0.0.1:11434)
  • Enable Ollama
  • Go back to chat and open the provider/model dropdown
  • Ollama models should now appear in the list (a quick way to confirm the server is reachable and see which models it has installed is sketched after this list)
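
If the models still don't show up, it's worth checking that the Ollama server is actually reachable at the base URL you configured. The snippet below is a minimal standalone sketch, not part of this PR's code: it assumes the default base URL `http://127.0.0.1:11434` and calls Ollama's `/api/tags` endpoint, which returns the models installed locally.

```ts
// Minimal sketch: verify the Ollama server is reachable and list installed models.
// Assumes the default base URL (http://127.0.0.1:11434); adjust to match the
// value set under Settings → Local Providers. Requires Node 18+ (global fetch).

const OLLAMA_BASE_URL = "http://127.0.0.1:11434";

interface OllamaModel {
  name: string;
}

async function listOllamaModels(baseUrl: string = OLLAMA_BASE_URL): Promise<string[]> {
  // /api/tags lists the models pulled locally (e.g. via `ollama pull llama3`).
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama not reachable at ${baseUrl} (HTTP ${res.status})`);
  }
  const data = (await res.json()) as { models: OllamaModel[] };
  return data.models.map((m) => m.name);
}

// If this prints an empty list, pull a model first, then reopen the
// provider/model dropdown in the chat view.
listOllamaModels()
  .then((names) => console.log("Installed Ollama models:", names))
  .catch((err) => console.error(err));
```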

@Gerome-Elassaad Gerome-Elassaad self-assigned this Dec 19, 2025
@Gerome-Elassaad Gerome-Elassaad linked an issue Dec 19, 2025 that may be closed by this pull request
@Gerome-Elassaad Gerome-Elassaad added the bug Something isn't working label Dec 19, 2025
@Gerome-Elassaad Gerome-Elassaad merged commit 78a02b4 into main Dec 19, 2025
3 checks passed

Development

Successfully merging this pull request may close these issues.

Cannot select a model from ollama