
Gateway: honor message-channel header for chat completions#30462

Merged
steipete merged 3 commits into openclaw:main from bmendonca3:bm/openai-chatcompletions-message-channel-r2
Mar 2, 2026

Conversation

@bmendonca3

Summary

Describe the problem and fix in 2–5 bullets:

  • Problem: POST /v1/chat/completions ignored x-openclaw-message-channel and always invoked agentCommand with messageChannel: "webchat".
  • Why it matters: clients using that endpoint cannot preserve their true origin channel, which breaks channel-aware behavior and delivery context.
  • What changed: the OpenAI-compatible HTTP bridge now reads and normalizes x-openclaw-message-channel, then passes it through to agentCommand instead of hardcoding webchat.
  • What did NOT change (scope boundary): this PR does not change session-key resolution, delivery behavior, /v1/responses, or any other HTTP endpoint.
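A self-contained sketch of the change described above. Here `getHeader` and `normalizeMessageChannel` are stubbed stand-ins for the gateway's existing helpers so the snippet runs on its own; the real normalizer also maps built-in and plugin channel ids, which this stub omits.

```typescript
// Stub request shape; the real gateway handler receives a Node HTTP request.
type Req = { headers: Record<string, string | undefined> };

// Stand-in for the gateway's getHeader helper.
function getHeader(req: Req, name: string): string | undefined {
  return req.headers[name.toLowerCase()];
}

// Stand-in for the shared normalizer: trim + lowercase, undefined when empty.
function normalizeMessageChannel(raw?: string | null): string | undefined {
  const v = raw?.trim().toLowerCase();
  return v || undefined;
}

// Before this PR the channel was hardcoded to "webchat"; now the header wins
// when present and "webchat" remains the fallback.
function resolveMessageChannel(req: Req): string {
  return normalizeMessageChannel(getHeader(req, "x-openclaw-message-channel")) ?? "webchat";
}
```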

Change Type (select all)

  • Bug fix
  • Feature
  • Refactor
  • Docs
  • Security hardening
  • Chore/infra

Scope (select all touched areas)

  • Gateway / orchestration
  • Skills / tool execution
  • Auth / tokens
  • Memory / storage
  • Integrations
  • API / contracts
  • UI / DX
  • CI/CD / infra

Linked Issue/PR

User-visible / Behavior Changes

POST /v1/chat/completions now preserves x-openclaw-message-channel when invoking the agent run instead of forcing the origin channel to webchat.

Security Impact (required)

  • New permissions/capabilities? (Yes/No) No
  • Secrets/tokens handling changed? (Yes/No) No
  • New/changed network calls? (Yes/No) No
  • Command/tool execution surface changed? (Yes/No) No
  • Data access scope changed? (Yes/No) No
  • If any Yes, explain risk + mitigation:

Repro + Verification

Environment

  • OS: macOS
  • Runtime/container: Node 22.22.0, pnpm 10.23.0
  • Model/provider: OpenAI-compatible gateway endpoint
  • Integration/channel (if any): POST /v1/chat/completions
  • Relevant config (redacted): token auth, openAiChatCompletionsEnabled: true

Steps

  1. On upstream/main, run pnpm exec vitest run --config /tmp/openclaw-repros/vitest.config.cjs /tmp/openclaw-repros/29449-openai-message-channel.test.ts --reporter=verbose.
  2. Observe the failing assertion: the first agentCommand call receives messageChannel: "webchat" even when the request header sets x-openclaw-message-channel: custom-client-channel.
  3. On this branch, run pnpm exec vitest run src/gateway/openai-http.message-channel.test.ts --reporter=verbose.

Expected

  • /v1/chat/completions should pass the normalized x-openclaw-message-channel header through to agentCommand.

Actual

  • Before this patch, the endpoint hardcoded messageChannel: "webchat".
  • After this patch, the endpoint forwards custom-client-channel into agentCommand.

Evidence

Attach at least one:

  • Failing test/log before + passing after
  • Trace/log snippets
  • Screenshot/recording
  • Perf numbers (if relevant)

Human Verification (required)

What you personally verified (not just CI), and how:

  • Verified scenarios: reproduced the bad header handling with a focused gateway harness; added src/gateway/openai-http.message-channel.test.ts; confirmed both the throwaway repro and the in-repo gateway test now observe messageChannel: "custom-client-channel".
  • Edge cases checked: the new path still normalizes the header through the existing message-channel helper, so built-in and plugin channels follow the same normalization as other gateway surfaces.
  • What you did not verify: downstream delivery behavior after the agent run, because this bug is limited to request-to-agent plumbing.

Verification commands:

  • pnpm exec vitest run src/gateway/openai-http.message-channel.test.ts --reporter=verbose
  • pnpm exec vitest run --config /tmp/openclaw-repros/vitest.config.cjs /tmp/openclaw-repros/29449-openai-message-channel.test.ts --reporter=verbose
  • pnpm check currently stops on a pre-existing, unrelated type error at src/gateway/server-cron.ts(197) on upstream/main; this PR does not touch that file.

Compatibility / Migration

  • Backward compatible? (Yes/No) Yes
  • Config/env changes? (Yes/No) No
  • Migration needed? (Yes/No) No
  • If yes, exact upgrade steps:

Failure Recovery (if this breaks)

  • How to disable/revert this change quickly: revert commit 4bc1b6ddd3.
  • Files/config to restore: src/gateway/openai-http.ts and src/gateway/openai-http.message-channel.test.ts.
  • Known bad symptoms reviewers should watch for: custom channel headers unexpectedly being dropped or normalized incorrectly.

Risks and Mitigations

List only real risks for this PR. Add/remove entries as needed. If none, write None.

  • Risk: callers can now pass arbitrary channel identifiers through this endpoint, which changes behavior for requests that previously always looked like webchat.
    • Mitigation: the value is still normalized by the shared normalizeMessageChannel helper, matching the behavior already used by other gateway HTTP entry points.

@greptile-apps
Contributor

greptile-apps bot commented Mar 1, 2026

Greptile Summary

This PR fixes a bug where POST /v1/chat/completions ignored the x-openclaw-message-channel header and always invoked agentCommand with messageChannel: "webchat". The fix reads and normalizes the header via the existing normalizeMessageChannel utility, falling back to "webchat" when absent, and passes the result through to agentCommand.

Changes:

  • src/gateway/openai-http.ts: Imports normalizeMessageChannel and getHeader, adds messageChannel to buildAgentCommandInput's parameter type, resolves the channel from the request header (with "webchat" fallback), and threads it into buildAgentCommandInput.
  • src/gateway/openai-http.message-channel.test.ts: New test file covering the header-present path ("custom-client-channel" → forwarded as-is after normalization) and the backward-compatible header-absent path (defaults to "webchat").

The implementation is minimal, well-scoped, and correctly leverages the shared normalizeMessageChannel helper that all other gateway HTTP entry points already use. No critical issues were found.

Confidence Score: 5/5

  • This PR is safe to merge — the change is minimal, well-tested, and backward-compatible.
  • The fix is a tightly scoped one-liner backed by two targeted tests covering both the new behavior and the backward-compatibility guarantee. It reuses existing, well-tested utilities (normalizeMessageChannel, getHeader) that are already exercised across other gateway entry points. No new permissions, network calls, or token handling are introduced, and the "webchat" fallback preserves the previous default for all existing callers.
  • No files require special attention.

Last reviewed commit: 10ed4ec

Contributor

@greptile-apps bot left a comment


2 files reviewed, 2 comments


Comment on lines +231 to +232
const messageChannel =
  normalizeMessageChannel(getHeader(req, "x-openclaw-message-channel") ?? "") ?? "webchat";
Contributor


Redundant ?? "" coercion

normalizeMessageChannel already accepts string | null | undefined and returns undefined for any falsy input (including null and undefined). The ?? "" is unnecessary since passing null/undefined directly produces the same result as passing "".

Suggested change

- const messageChannel =
-   normalizeMessageChannel(getHeader(req, "x-openclaw-message-channel") ?? "") ?? "webchat";
+ const messageChannel =
+   normalizeMessageChannel(getHeader(req, "x-openclaw-message-channel")) ?? "webchat";

Author


Applied on head 10ed4ece1a.

  • Removed redundant ?? "" before normalizeMessageChannel(...) in openai-http.ts.
  • Added regression coverage for the backward-compatible default path:
    defaults messageChannel to webchat when header is absent.

Verification:
pnpm -s vitest run src/gateway/openai-http.message-channel.test.ts

Comment on lines +7 to +42
it("passes x-openclaw-message-channel through to agentCommand", async () => {
  agentCommand.mockReset();
  agentCommand.mockResolvedValueOnce({ payloads: [{ text: "ok" }] } as never);

  await withGatewayServer(
    async ({ port }) => {
      const res = await fetch(`http://127.0.0.1:${port}/v1/chat/completions`, {
        method: "POST",
        headers: {
          "content-type": "application/json",
          authorization: "Bearer secret",
          "x-openclaw-message-channel": "custom-client-channel",
        },
        body: JSON.stringify({
          model: "openclaw",
          messages: [{ role: "user", content: "hi" }],
        }),
      });

      expect(res.status).toBe(200);
      const firstCall = (agentCommand.mock.calls[0] as unknown[] | undefined)?.[0] as
        | { messageChannel?: string }
        | undefined;
      expect(firstCall?.messageChannel).toBe("custom-client-channel");
      await res.text();
    },
    {
      serverOptions: {
        host: "127.0.0.1",
        auth: { mode: "token", token: "secret" },
        controlUiEnabled: false,
        openAiChatCompletionsEnabled: true,
      },
    },
  );
});
Contributor


Missing test for default "webchat" fallback

The test suite covers the header-present case, but there's no assertion that when x-openclaw-message-channel is absent the channel still defaults to "webchat". This is the backward-compatibility guarantee the PR is explicitly preserving, and a regression there would be silent.

Consider adding a second it case that omits the header and asserts firstCall?.messageChannel === "webchat".
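The shape of that fallback check can be sketched with the gateway collapsed to a stub (everything here is illustrative; the real test would use the withGatewayServer harness and agentCommand mock shown above):

```typescript
// Illustrative stub: records what the "agent" receives so the fallback can be
// asserted without the HTTP server or vitest harness.
type AgentInput = { messageChannel?: string };
const calls: AgentInput[] = [];

function agentCommandStub(input: AgentInput): void {
  calls.push(input);
}

// Hypothetical stand-in for the request path: header value when present,
// "webchat" otherwise.
function handleRequest(headers: Record<string, string>): void {
  const raw = headers["x-openclaw-message-channel"]?.trim().toLowerCase();
  agentCommandStub({ messageChannel: raw || "webchat" });
}

handleRequest({}); // header absent → default applies
```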


Author


Applied on head 10ed4ece1a.

  • Removed redundant ?? "" before normalizeMessageChannel(...) in openai-http.ts.
  • Added regression coverage for the backward-compatible default path:
    defaults messageChannel to webchat when header is absent.

Verification:
pnpm -s vitest run src/gateway/openai-http.message-channel.test.ts

@aisle-research-bot

aisle-research-bot bot commented Mar 2, 2026

🔒 Aisle Security Analysis

We found 1 potential security issue in this PR:

  1. 🔵 Low — Unvalidated x-openclaw-message-channel header allows log/prompt injection via arbitrary messageChannel

1. 🔵 Unvalidated x-openclaw-message-channel header allows log/prompt injection via arbitrary messageChannel

  • Severity: Low
  • CWE: CWE-117
  • Location: src/gateway/openai-http.ts:229-233

Description

handleOpenAiHttpRequest now trusts the untrusted x-openclaw-message-channel HTTP header to set messageChannel.

  • normalizeMessageChannel() only does trim().toLowerCase() and otherwise returns arbitrary strings, including unknown/custom values.
  • The resulting messageChannel is propagated into the agent runtime where it is interpolated into log lines (e.g., embedded run start: ... messageChannel=...) and used as a “channel hint” in other places (tool-format selection, prompt text).
  • Because there is no character allowlist / length limit / control-char rejection, a crafted header value (e.g., very long strings, Unicode bidi controls, other non-printable characters) can:
    • forge/mislead logs (CWE-117)
    • cause high-cardinality/garbled operational metadata
    • potentially confuse downstream routing/policy code that assumes channel-like identifiers

Vulnerable code:

const messageChannel =
  normalizeMessageChannel(getHeader(req, "x-openclaw-message-channel")) ?? "webchat";

Recommendation

Treat x-openclaw-message-channel as untrusted and constrain it.

Option A (recommended): enforce a strict, length-bounded identifier format and drop invalid values:

function normalizeSafeMessageChannel(raw?: string | null): string | undefined {
  const v = raw?.trim().toLowerCase();
  if (!v) return undefined;
  // 1–64 chars, ASCII safe; prevents bidi/control/whitespace surprises.
  if (!/^[a-z0-9][a-z0-9_-]{0,63}$/.test(v)) return undefined;
  return normalizeMessageChannel(v); // or inline the built-in/plugin mapping
}

const messageChannel =
  normalizeSafeMessageChannel(getHeader(req, "x-openclaw-message-channel")) ?? "webchat";

Option B: allow only known channels (built-in + registered plugin ids), otherwise default.

Additionally, when logging messageChannel, escape/encode it (e.g., JSON.stringify(value) or a dedicated log-sanitizer) to prevent log forging even if unexpected characters slip through.
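Option B plus the log-escaping idea could look roughly like this (a sketch; KNOWN_CHANNELS is an illustrative set, not the real built-in/plugin registry, and the log format string is hypothetical):

```typescript
// Illustrative allowlist; a real gateway would source this from the built-in
// channels plus registered plugin ids.
const KNOWN_CHANNELS = new Set(["webchat", "slack", "discord"]);

// Accept only known channels; anything else falls back to the default.
function resolveKnownChannel(raw?: string | null): string {
  const v = raw?.trim().toLowerCase();
  return v && KNOWN_CHANNELS.has(v) ? v : "webchat";
}

// Escape before logging so control/bidi characters cannot forge log lines.
function logChannel(channel: string): string {
  return `embedded run start: messageChannel=${JSON.stringify(channel)}`;
}
```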


Analyzed PR: #30462 at commit 9cf5bcd

Last updated on: 2026-03-02T23:58:55Z

@bmendonca3
Author

@greptile-apps please re-review latest head for updated confidence score.

@bmendonca3
Author

@greptile review latest commit, please.

@steipete force-pushed the bm/openai-chatcompletions-message-channel-r2 branch from 10ed4ec to 9cf5bcd on March 2, 2026 at 22:48
@steipete steipete merged commit f25be78 into openclaw:main Mar 2, 2026
19 of 21 checks passed
@steipete
Contributor

steipete commented Mar 2, 2026

Landed via temp rebase onto main.

  • Gate: pnpm -s vitest run src/gateway/openai-http.message-channel.test.ts src/gateway/openai-http.test.ts
  • Land commit: 9cf5bcd
  • Merge commit: f25be78

Thanks @bmendonca3!

dawi369 pushed a commit to dawi369/davis that referenced this pull request Mar 3, 2026
OWALabuy pushed a commit to kcinzgg/openclaw that referenced this pull request Mar 4, 2026
zooqueen pushed a commit to hanzoai/bot that referenced this pull request Mar 6, 2026

Labels

gateway (Gateway runtime), size: S


Development

Successfully merging this pull request may close these issues.

[Bug]: /v1/chat/completions ignores x-openclaw-message-channel header
