
[Feature]: Add OpenAI-compatible LLM provider support in OASM Assistant settings #246

@l1ttps

Description

Feature Description

Add functionality to connect to LLMs via an OpenAI-compatible API directly within the OASM Assistant settings. This would let users configure custom large language models (such as locally hosted models, Ollama, or other third-party providers) by supplying the following parameters:

  • API URL: The endpoint address for the LLM service.
  • API Key: The authentication token required to access the API.
  • Model Name: The specific model identifier to be used for requests.
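The three parameters above could be grouped into a single settings object. A minimal sketch follows; the class and field names are illustrative, not the actual OASM schema:

```python
from dataclasses import dataclass


# Hypothetical shape of the provider settings described above.
# Field names are placeholders, not the real OASM configuration keys.
@dataclass
class LLMProviderSettings:
    api_url: str     # endpoint address for the LLM service
    api_key: str     # authentication token for the API
    model_name: str  # model identifier sent with each request


# Example: pointing the Assistant at a local Ollama server,
# which exposes an OpenAI-compatible API under /v1.
settings = LLMProviderSettings(
    api_url="http://localhost:11434/v1",
    api_key="placeholder-key",
    model_name="llama3",
)
```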

Problem or Need

Currently, the OASM Assistant lacks a flexible way for users to define their own AI backend. Users need to integrate with diverse AI infrastructures to ensure data privacy (via self-hosted models), optimize costs, or leverage models tailored for security analysis. An OpenAI-compatible interface is the de facto industry standard for such integrations.

Proposed Solution

In the OASM Assistant settings section, implement a configuration interface that includes:

  1. Input fields for API URL, API Key, and Model Name.
  2. A mechanism to save these credentials securely.
  3. Integration logic within the Assistant to route prompts through the configured OpenAI-compatible provider instead of a hardcoded service.
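For step 3, the routing logic would send prompts to the configured endpoint using the standard OpenAI-compatible wire format. A sketch of how such a request could be built is shown below; the helper name is hypothetical and the real integration would live inside the Assistant, but the `/chat/completions` path, `Authorization: Bearer` header, and JSON body shown here follow the standard OpenAI-compatible protocol:

```python
import json
import urllib.request


def build_chat_request(api_url: str, api_key: str,
                       model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for an OpenAI-compatible /chat/completions endpoint.

    Hypothetical helper illustrating the wire format only; no request is
    sent here. `api_url`, `api_key`, and `model` come from the user's
    Assistant settings.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        api_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Example: a prompt routed to a local OpenAI-compatible server.
req = build_chat_request(
    "http://localhost:11434/v1", "placeholder-key", "llama3", "Summarize this scan result."
)
```

Because any OpenAI-compatible provider accepts this same request shape, swapping backends would only require changing the saved settings, not the Assistant's integration code.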

Alternatives Considered

No response

Additional Context

This feature focuses specifically on the Assistant's settings to allow end-users to customize their "AI brain" without needing to modify global system or MCP server configurations.

Code of Conduct

  • I agree to follow this project's Code of Conduct

Metadata

Assignees

Labels

enhancement (New feature or request)

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
