# llm-cli

A command-line tool for managing LLM (Large Language Model) provider configurations.

## Features
- Store API keys and base URLs for multiple LLM providers
- Interactive commands for adding, updating, and removing providers
- Validation of provider configurations
- Environment variable setup for easy integration with other tools
- Copy environment variables to clipboard
## Installation

```bash
npm install -g llm-cli
```

## Usage

List configured providers:

```bash
llm-cli list
# or
llm-cli ls
```

Add a new provider interactively:

```bash
llm-cli add
```

Update an existing provider:

```bash
llm-cli update <provider-name>
```

Remove a provider:

```bash
llm-cli remove <provider-name>
# or
llm-cli rm <provider-name>
```

Validate one or all provider configurations:

```bash
llm-cli check [provider-name]
```

Automatically remove providers:

```bash
llm-cli autoremove
```

Set environment variables:

```bash
llm-cli set-env
```

Copy a provider's environment variables to the clipboard:

```bash
llm-cli copy-env <provider-name>
```

## Configuration

Configurations are stored in `~/.llm-cli/config.json` in the following format:
```json
{
  "Providers": [
    {
      "name": "openrouter",
      "api_base_url": "https://openrouter.ai/api/v1",
      "api_key": "sk-or-v1-xxx-example-key-xxx",
      "models": [
        "google/gemini-flash-1.5",
        "anthropic/claude-3.5-sonnet"
      ],
      "transformer": {
        "use": ["openrouter"]
      }
    }
  ]
}
```

## Supported Providers

The tool works with any OpenAI-compatible API, including:
- OpenAI
- Azure OpenAI
- OpenRouter
- DeepSeek
- Gemini
- Ollama
- And many more
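Because the configuration is plain JSON, other tools can consume it directly. The sketch below parses a config in the format shown above and maps one provider's fields to environment-style variables. The variable names (`OPENAI_API_KEY`, `OPENAI_BASE_URL`) and the `envForProvider` helper are illustrative assumptions, not part of llm-cli itself.

```javascript
// Sketch: parse a config in llm-cli's on-disk format and derive env vars
// for a chosen provider. In practice you would read ~/.llm-cli/config.json;
// here an inline example keeps the snippet self-contained.
const exampleConfig = JSON.stringify({
  Providers: [
    {
      name: "openrouter",
      api_base_url: "https://openrouter.ai/api/v1",
      api_key: "sk-or-v1-xxx-example-key-xxx",
      models: ["google/gemini-flash-1.5", "anthropic/claude-3.5-sonnet"],
      transformer: { use: ["openrouter"] },
    },
  ],
});

// Look up a provider by name and map its fields to env-style variables.
function envForProvider(configJson, providerName) {
  const config = JSON.parse(configJson);
  const provider = config.Providers.find((p) => p.name === providerName);
  if (!provider) throw new Error(`unknown provider: ${providerName}`);
  return {
    OPENAI_API_KEY: provider.api_key,       // assumed variable name
    OPENAI_BASE_URL: provider.api_base_url, // assumed variable name
  };
}

const env = envForProvider(exampleConfig, "openrouter");
console.log(env.OPENAI_BASE_URL); // prints the provider's base URL
```

The same lookup works for any provider entry in the `Providers` array, since every entry shares the `name`, `api_key`, and `api_base_url` fields.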
## License

MIT