Hi,
it would be great if the /model command also listed all local models available through Ollama; currently it doesn't, and no models are shown at all:
User: /model
Current model: local/llama3.1:8b
price: input $0/Mtok, output $0/Mtok
context: 128000, max output: None
(streaming: True, vision: False)
Available models:
<no_output>
My global config.toml is as follows:
File: /home/<user>/.config/gptme/config.toml
providers = []
[prompt]
about_user = "I am a curious human programmer."
response_preference = "Don't explain basic concepts"
[prompt.project]
activitywatch = "ActivityWatch is a free and open-source automated time-tracker that helps you track how you spend your time on your devices."
gptme = "gptme is a CLI to interact with large language models in a Chat-style interface, enabling the assistant to execute commands and code on the local machine, letting them assist in all kinds of development and terminal-based work."
[env]
# Uncomment to use Claude 3.5 Sonnet by default
#MODEL = "anthropic/claude-3-5-sonnet-20240620"
# One of these need to be set
# If none of them are, they will be prompted for on first start
# OPENAI_API_KEY = ""
# ANTHROPIC_API_KEY = ""
# OPENROUTER_API_KEY = ""
# XAI_API_KEY = ""
# GEMINI_API_KEY = ""
# GROQ_API_KEY = ""
# DEEPSEEK_API_KEY = ""
# Uncomment to use with Ollama
MODEL = "local/gpt-oss:latest"
OPENAI_BASE_URL = "http://localhost:11434/v1"
# Uncomment to change tool configuration
TOOL_FORMAT = "tool" # Select the tool format. One of `markdown`, `xml`, `tool`
# TOOL_ALLOWLIST = "save,append,patch,ipython,shell,browser" # Comma separated list of allowed tools
#TOOL_MODULES = "gptme.tools,custom.tools" # List of python comma separated python module path
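For reference, Ollama exposes its installed models at the /api/tags endpoint of the same server the config above points at, so /model could enumerate them roughly like this. This is just a sketch of the idea, assuming a default Ollama install on port 11434; list_ollama_models and model_names are hypothetical helpers, not existing gptme functions:

```python
import json
import urllib.request


def model_names(payload: dict) -> list[str]:
    # /api/tags responds with {"models": [{"name": "llama3.1:8b", ...}, ...]}
    return [m["name"] for m in payload.get("models", [])]


def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    # Ask the local Ollama server which models are installed
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return model_names(json.load(resp))
```

With this, the "Available models:" section of /model output could append the names returned by list_ollama_models() whenever OPENAI_BASE_URL points at an Ollama server.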
Thanks in advance!