
feat: Add OpenAI-Compliant /v1/embeddings Endpoints #13898

Closed
pranitl wants to merge 1 commit into open-webui:dev from pranitl:feature/openai-embeddings-schema

Conversation


@pranitl pranitl commented May 15, 2025

Pull Request Checklist

Note to first-time contributors: Please open a discussion post in Discussions and describe your changes before submitting a pull request.

Before submitting, make sure you've checked the following:

  • Target branch: Please verify that the pull request targets the dev branch.
  • Description: This PR introduces new OpenAI-compliant /v1/embeddings API endpoints for both OpenAI-compatible services (via /openai/v1/embeddings) and Ollama backends (via /ollama/v1/embeddings). It includes shared Pydantic schemas for consistent request/response structures and a utility for token estimation.
  • Changelog: Ensure a changelog entry following the format of Keep a Changelog is added at the bottom of the PR description.
  • Documentation: API documentation is generated automatically via FastAPI's /docs page.
  • Dependencies: No new dependencies added.
  • Testing: I used curl commands to verify new endpoints work as expected without breaking older existing embedding endpoints. See the gist for the various commands.
  • Code review: I have performed a self-review of the code, addressing coding standard issues and ensuring adherence to the project's coding standards.
  • Prefix: To clearly categorize this pull request, prefix the pull request title using one of the following:
    • BREAKING CHANGE: Significant changes that may affect compatibility
    • build: Changes that affect the build system or external dependencies
    • ci: Changes to our continuous integration processes or workflows
    • chore: Refactor, cleanup, or other non-functional code changes
    • docs: Documentation update or addition
    • feat: Introduces a new feature or enhancement to the codebase
    • fix: Bug fix or error correction
    • i18n: Internationalization or localization changes
    • perf: Performance improvement
    • refactor: Code restructuring for better maintainability, readability, or scalability
    • style: Changes that do not affect the meaning of the code (white space, formatting, missing semi-colons, etc.)
    • test: Adding missing tests or correcting existing tests
    • WIP: Work in progress, a temporary label for incomplete or ongoing work

Changelog Entry

Description

This pull request introduces standardized, OpenAI-compliant embedding generation capabilities through new API endpoints. It adds POST /openai/v1/embeddings to interface with various configured OpenAI-compatible services and POST /ollama/v1/embeddings to provide an OpenAI-compliant interface for Ollama's embedding models. This enhancement aims to provide a consistent API for embedding generation, regardless of the backend service.

The implementation includes:

  • Shared Pydantic models (OpenAIEmbeddingsRequest, OpenAIEmbeddingsResponse, etc.) in backend/open_webui/models/openai_schemas.py for consistent OpenAI-compliant request/response handling across the new endpoints.
  • A token estimation utility (estimate_embedding_tokens) in backend/open_webui/utils/misc.py for use by the Ollama endpoint to populate usage statistics.
  • The /ollama/v1/embeddings endpoint performs translation of OpenAI-formatted requests to Ollama's native embedding request format and translates Ollama's responses back to the OpenAI schema.
  • The /openai/v1/embeddings endpoint routes requests to the appropriate configured OpenAI-compatible backend (e.g., official OpenAI, Azure OpenAI, other generic providers).
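The shared schemas themselves are not shown in this description. As a rough sketch of what the models in backend/open_webui/models/openai_schemas.py might look like (field names follow the public OpenAI embeddings API; the actual definitions in the PR may differ):

```python
from typing import List, Optional, Union

from pydantic import BaseModel


class OpenAIEmbeddingsRequest(BaseModel):
    """OpenAI-style embeddings request body."""

    model: str
    input: Union[str, List[str]]
    encoding_format: Optional[str] = "float"  # "float" or "base64"
    user: Optional[str] = None


class OpenAIEmbeddingData(BaseModel):
    """A single embedding vector with its position in the batch."""

    object: str = "embedding"
    embedding: List[float]
    index: int


class OpenAIUsage(BaseModel):
    prompt_tokens: int
    total_tokens: int


class OpenAIEmbeddingsResponse(BaseModel):
    """Top-level OpenAI-style embeddings response."""

    object: str = "list"
    data: List[OpenAIEmbeddingData]
    model: str
    usage: OpenAIUsage
```

Because the endpoints are Pydantic-typed, FastAPI validates requests and renders the schemas on the /docs page automatically.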

Added

  • New API endpoint POST /openai/v1/embeddings in backend/open_webui/routers/openai.py for generating embeddings using configured OpenAI-compatible services.
  • New API endpoint POST /ollama/v1/embeddings in backend/open_webui/routers/ollama.py for generating embeddings via Ollama, using an OpenAI-compliant interface.
  • New shared Pydantic models for OpenAI embedding schemas (OpenAIEmbeddingsRequest, OpenAIEmbeddingData, OpenAIUsage, OpenAIEmbeddingsResponse) in backend/open_webui/models/openai_schemas.py.
  • New utility function estimate_embedding_tokens in backend/open_webui/utils/misc.py to help estimate token counts for embedding inputs, primarily for Ollama.
  • New helper function _generate_ollama_embeddings_native (or similar) within backend/open_webui/routers/ollama.py to centralize Ollama's native embedding call logic.
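The estimate_embedding_tokens utility is not shown in the PR description. A plausible sketch, assuming the common "roughly 4 characters per token" heuristic (the actual implementation in utils/misc.py may use a different method):

```python
from typing import List, Union


def estimate_embedding_tokens(input_data: Union[str, List[str]]) -> int:
    """Roughly estimate the token count for embedding inputs.

    This is a sketch using the ~4 characters-per-token rule of thumb,
    not the utility actually added in backend/open_webui/utils/misc.py.
    """
    texts = [input_data] if isinstance(input_data, str) else input_data
    # Count at least one token per non-empty input item.
    return sum(max(1, len(text) // 4) for text in texts if text)
```

Since Ollama's native embedding responses do not always report usage, an estimate like this lets the endpoint populate the OpenAI-style usage block.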

Changed

  • Refactored the existing Ollama native embedding logic within backend/open_webui/routers/ollama.py to use a new internal helper function. This improves code structure and reusability for calls to Ollama's /api/embed or /api/embeddings.
  • Uses existing model configuration mechanisms to fetch API keys, base URLs, and other parameters for OpenAI-compatible embedding models.
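As a rough illustration of the request/response translation the /ollama/v1/embeddings endpoint performs (function names and shapes here are assumptions, not the actual code in backend/open_webui/routers/ollama.py):

```python
from typing import Any, Dict, List


def openai_to_ollama_request(payload: Dict[str, Any]) -> Dict[str, Any]:
    """Translate an OpenAI-style embeddings request into the shape
    expected by Ollama's native /api/embed endpoint (a sketch)."""
    inputs = payload["input"]
    return {
        "model": payload["model"],
        # Ollama's /api/embed accepts a list of inputs; normalize to one.
        "input": inputs if isinstance(inputs, list) else [inputs],
    }


def ollama_to_openai_response(
    model: str, embeddings: List[List[float]], prompt_tokens: int
) -> Dict[str, Any]:
    """Wrap Ollama's embedding vectors in an OpenAI-compliant body."""
    return {
        "object": "list",
        "data": [
            {"object": "embedding", "embedding": vector, "index": i}
            for i, vector in enumerate(embeddings)
        ],
        "model": model,
        "usage": {"prompt_tokens": prompt_tokens, "total_tokens": prompt_tokens},
    }
```

The key point is that callers see only the OpenAI schema on both sides; the native Ollama format stays an internal detail of the router.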

Deprecated

  • No functionality is deprecated by this PR. (Note: /ollama/api/embed is already marked as deprecated in the codebase prior to this work).

Removed

  • No functionality or files are removed by this PR.

Fixed

  • N/A (This PR is primarily a new feature).

Security

  • New endpoints adhere to existing authentication and authorization mechanisms within Open WebUI. User context is utilized for requests.

Breaking Changes

  • No breaking changes are introduced. Existing Ollama embedding endpoints (/ollama/api/embed and /ollama/api/embeddings) remain functional and unaffected.

Additional Information

  • N/A

Screenshots or Videos

  • N/A (Backend API changes). curl commands for testing have been successfully executed against a local development instance.

Contributor License Agreement

By submitting this pull request, I confirm that I have read and fully agree to the Contributor License Agreement (CLA), and I am providing my contributions under its terms.

@tjbck tjbck changed the base branch from main to dev May 15, 2025 09:06

from open_webui.config import (
LICENSE_KEY,
# Ollama
Contributor commented: Don't remove these comments

CODE_INTERPRETER_JUPYTER_AUTH_TOKEN,
CODE_INTERPRETER_JUPYTER_AUTH_PASSWORD,
CODE_INTERPRETER_JUPYTER_TIMEOUT,
# Image
Contributor commented: Same here, and all over this file

f"Frontend build directory not found at '{FRONTEND_BUILD_DIR}'. Serving API only."
)

# New centralized, versioned dispatcher for embeddings
Contributor commented: This comment is not needed

user: UserModel = Depends(get_verified_user)
):
model_id_from_request = request_data.model
log = logging.getLogger(__name__) # Get logger instance
Contributor commented: Why are we getting a logger within a handler? Use the existing logger.
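The reviewer's point is that a logger should be created once at module import, not inside every request handler. A minimal sketch of the conventional pattern (the handler name and payload shape are illustrative only):

```python
import logging

# Module-level logger, created once when the module is imported.
log = logging.getLogger(__name__)


def handle_embeddings_request(payload: dict) -> None:
    # Reuse the module-level logger rather than calling
    # logging.getLogger(__name__) on every request.
    log.debug("embeddings request for model %s", payload.get("model"))
```

logging.getLogger caches loggers by name, so repeated per-request calls are not incorrect, just redundant; the module-level form matches the rest of the codebase.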

from starlette.background import BackgroundTask

from open_webui.models.models import Models
from open_webui.models.embeddings import OpenAIEmbeddingResponse
Contributor commented: This file has a line where "embeddings" is filtered out from the /openai handler. With this PR, should that filter be removed or handled differently?

@tjbck tjbck marked this pull request as draft May 16, 2025 13:24
Ithanil (Contributor) commented Jun 4, 2025

Any chance this can be picked back up? Or is there any other way to expose embedding models from an OpenAI-compatible backend (e.g. via /api/embed)?

EDIT: Just half an hour after this comment #14667 popped up, which looks very good.

tjbck (Contributor) commented Jun 4, 2025

Closing in favour of #14667

@tjbck tjbck closed this Jun 4, 2025