Closed
Labels
good first issue (Good for newcomers)
Description
Currently, valid models are hardcoded in the adapter(s) and checked by `exit.lua#set_model`.
This requires releasing a new version of exit.nvim every time a provider (e.g., openai) ships a new model.
One way to fix this would be to validate only the adapter, accept any model string, and surface an error from the API response if an invalid model was used.
Example:
- `0p3n4i:gpt-3.5-turbo` should error and list the valid adapters (e.g., `openai`, `ollama`)
- `openai:gp1-3.5-1urb0` should be allowed, but the request would fail with something like "model 'openai:gp1-3.5-1urb0' is not valid, please select a valid model"; a list of valid models could be returned, or a URL to the docs where to find them
- `openai:gpt-3.5-turbo` would work as usual
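A minimal sketch of what the permissive `set_model` could look like. The adapter table, function name, and error wording below are assumptions for illustration, not the actual exit.nvim internals:

```lua
-- Sketch only: the adapter list and function shape are assumptions,
-- not exit.nvim's real implementation.
local adapters = { openai = true, ollama = true }

-- Validate only the adapter part of "adapter:model";
-- accept any model string and defer model validation to the API.
local function set_model(spec)
  local adapter, model = spec:match("^([^:]+):(.+)$")
  if not adapter or not adapters[adapter] then
    local names = {}
    for name in pairs(adapters) do
      names[#names + 1] = name
    end
    table.sort(names)
    error(("invalid adapter in %q; valid adapters: %s")
      :format(spec, table.concat(names, ", ")))
  end
  -- The model is only checked later, when the provider's API responds.
  return { adapter = adapter, model = model }
end
```

With this, `set_model("openai:gp1-3.5-1urb0")` succeeds locally and the typo is only reported once the request fails, while `set_model("0p3n4i:gpt-3.5-turbo")` errors immediately.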
Another solution would be to contact an API endpoint when setting the model to check what models are available for that adapter.
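For the second approach, both providers expose a public list-models route (OpenAI's `GET /v1/models`, Ollama's `GET /api/tags`). A rough sketch, assuming a curl-based fetch inside Neovim; the function name and response handling are illustrative, not exit.nvim code:

```lua
-- Sketch only: fetching via curl + vim.fn.system is an assumption;
-- a real implementation might use plenary.curl or vim.system().
local endpoints = {
  openai = "https://api.openai.com/v1/models",  -- responds with { data = { { id = ... }, ... } }
  ollama = "http://localhost:11434/api/tags",   -- responds with { models = { { name = ... }, ... } }
}

-- Ask the provider which models it currently offers.
local function list_models(adapter, api_key)
  local url = endpoints[adapter]
  if not url then
    return nil, "unknown adapter: " .. adapter
  end
  local cmd = ("curl -s -H 'Authorization: Bearer %s' '%s'")
    :format(api_key or "", url)
  local body = vim.fn.system(cmd)
  local ok, decoded = pcall(vim.json.decode, body)
  if not ok then
    return nil, "could not decode models response"
  end
  local models = {}
  -- Normalize the two response shapes into a flat list of names.
  for _, entry in ipairs(decoded.data or decoded.models or {}) do
    models[#models + 1] = entry.id or entry.name
  end
  return models
end
```

`set_model` could then check membership in this list at selection time, at the cost of one extra network round trip per change (and a failure mode when offline).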