
Add support for custom HTTP headers in VLM models (OpenAI-compatible) #723

Merged
qin-ctx merged 5 commits into volcengine:main from KorenKrita:feat/vlm-extra-headers
Mar 18, 2026

Conversation

@KorenKrita
Contributor

Description

Add custom HTTP header support for the VLM (Vision Language Model) OpenAI-compatible backend. Users can pass custom request headers (e.g., the HTTP-Referer and X-Title headers required by OpenRouter) via the extra_headers configuration option.

Related Issue

Type of Change

  • New feature (non-breaking change that adds functionality)
  • Documentation update

Changes Made

  1. Core code changes:

    • openviking/models/vlm/base.py: VLMBase extracts extra_headers from config
    • openviking/models/vlm/backends/openai_vlm.py: OpenAIVLM passes extra_headers as default_headers to OpenAI clients (sync/async)
    • openviking_cli/utils/config/vlm_config.py: VLMConfig supports extra_headers in providers configuration
  2. Tests:

    • Added tests/models/test_vlm_extra_headers.py with 6 test cases covering sync/async clients, empty config, VLMConfig forwarding, etc.
  3. Documentation and config:

    • Updated examples/ov.conf.example with an extra_headers usage example (OpenRouter scenario)
    • Updated docs/zh/guides/01-configuration.md and docs/en/guides/01-configuration.md with extra_headers parameter description and usage examples
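The core change can be sketched as follows. This is a minimal illustration, not the actual OpenViking code: the function name build_client_kwargs and the config-dict shape are assumptions; only the idea of forwarding extra_headers as the OpenAI client's default_headers comes from the PR.

```python
# Sketch (assumed names) of extracting extra_headers from a VLM config
# dict and building the keyword arguments passed to the OpenAI
# sync/async client constructors as default_headers.
def build_client_kwargs(config: dict) -> dict:
    kwargs = {
        "api_key": config["api_key"],
        "base_url": config.get("api_base"),
    }
    # Only forward headers when the user actually configured some, so an
    # empty or missing extra_headers leaves the client untouched.
    extra_headers = config.get("extra_headers") or {}
    if extra_headers:
        kwargs["default_headers"] = dict(extra_headers)
    return kwargs


cfg = {
    "api_key": "sk-demo",
    "api_base": "https://openrouter.ai/api/v1",
    "extra_headers": {
        "HTTP-Referer": "https://your-site.com",
        "X-Title": "Your App Name",
    },
}
print(build_client_kwargs(cfg)["default_headers"]["X-Title"])  # Your App Name
```

Passing the headers as default_headers means they are attached to every request the client makes, which matches OpenRouter's expectation that attribution headers accompany each call.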

Testing

  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I have tested this on the following platforms:
    • Linux
    • macOS
    • Windows

Checklist

  • My code follows the project's coding style
  • I have performed a self-review of my code
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings

Additional Notes

Usage example:

{
  "vlm": {
    "provider": "openai",
    "api_key": "your-api-key",
    "model": "gpt-4o",
    "api_base": "https://openrouter.ai/api/v1",
    "extra_headers": {
      "HTTP-Referer": "https://your-site.com",
      "X-Title": "Your App Name"
    }
  }
}

This feature only takes effect for OpenAI-compatible VLM providers (e.g., openai, litellm); it does not affect the volcengine provider.
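The provider scoping described above can be illustrated with a small sketch. This is hypothetical code, not the PR's implementation: the set of provider names and the function headers_for are assumptions used only to show the dispatch behavior.

```python
# Hypothetical illustration: extra_headers is honored only for
# OpenAI-compatible backends; other backends (e.g., volcengine)
# ignore it entirely.
OPENAI_COMPATIBLE = {"openai", "litellm"}


def headers_for(provider: str, config: dict) -> dict:
    if provider in OPENAI_COMPATIBLE:
        return config.get("extra_headers") or {}
    return {}


print(headers_for("openai", {"extra_headers": {"X-Title": "Demo"}}))      # {'X-Title': 'Demo'}
print(headers_for("volcengine", {"extra_headers": {"X-Title": "Demo"}}))  # {}
```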


Add extra_headers configuration option for VLM models to support custom
HTTP headers (e.g., HTTP-Referer, X-Title) when using OpenAI-compatible
providers like OpenRouter.

Changes:
- VLMBase extracts extra_headers from config
- OpenAIVLM passes extra_headers as default_headers to OpenAI client
- VLMConfig supports extra_headers in providers config
- Add tests for extra_headers functionality
- Update configuration docs (zh/en) and example config

Co-Authored-By: KorenKrita <KorenKrita@gmail.com>
@CLAassistant

CLAassistant commented Mar 18, 2026

CLA assistant check
All committers have signed the CLA.

Collaborator

@qin-ctx qin-ctx left a comment


Two blocking issues found. The core code changes look clean, but the docs/config need adjustment before merge.

- Add extra_headers: Optional[Dict[str, str]] field to VLMConfig
- Migrate extra_headers to providers structure in _migrate_legacy_config
- Remove confusing example-only keys from ov.conf.example
- Add test for flat extra_headers config style

Co-Authored-By: KorenKrita <KorenKrita@gmail.com>
# Conflicts:
#	openviking/models/vlm/base.py
Format long assertion lines to pass CI checks.

Co-Authored-By: KorenKrita <KorenKrita@gmail.com>
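The legacy-config migration requested in review could look roughly like the sketch below. The function name migrate_legacy_config and the dict shapes are assumptions; only the requirement itself (move a flat extra_headers key into the per-provider providers structure) comes from the review comment.

```python
# Sketch (assumed shape) of migrating a legacy flat `extra_headers` key
# into the per-provider `providers` structure.
def migrate_legacy_config(vlm_cfg: dict) -> dict:
    cfg = dict(vlm_cfg)
    extra = cfg.pop("extra_headers", None)
    provider = cfg.get("provider", "openai")
    if extra:
        # Nest the headers under the active provider's entry so both the
        # old flat style and the new providers style resolve identically.
        providers = cfg.setdefault("providers", {})
        providers.setdefault(provider, {})["extra_headers"] = dict(extra)
    return cfg


legacy = {"provider": "openai", "extra_headers": {"X-Title": "Demo"}}
migrated = migrate_legacy_config(legacy)
print(migrated["providers"]["openai"]["extra_headers"])  # {'X-Title': 'Demo'}
```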
@KorenKrita KorenKrita requested a review from qin-ctx March 18, 2026 04:49
@KorenKrita
Copy link
Copy Markdown
Contributor Author

@qin-ctx CR fixes done

@qin-ctx qin-ctx merged commit cd87c0a into volcengine:main Mar 18, 2026
5 of 6 checks passed
@github-project-automation github-project-automation bot moved this from Backlog to Done in OpenViking project Mar 18, 2026
Labels: None yet
Projects: Status: Done
3 participants