
fix(codex): normalize openai-codex transport in all remaining runtime paths#41450

Closed
jackal092927 wants to merge 17 commits into openclaw:main from jackal092927:fix/codex-transport-path

Conversation

@jackal092927
Contributor

Problem

After #38736, openai-codex/gpt-5.4 still times out in some runtime paths. Model discovery, media tools, and image understanding use the stale openai-responses transport/api.openai.com base URL instead of the correct openai-codex-responses/chatgpt.com/backend-api combination. (fixes #41282)

Root Cause

normalizeResolvedProviderModel() only lived in the embedded runner's local module. Other code paths (model registry, media tools, image provider) resolved models from the registry without normalization, so openai-codex models got stale transport config.

Solution

  • Hoist normalizeResolvedProviderModel() to a shared model.provider-normalization.ts module
  • Wrap ModelRegistry with a Proxy that auto-normalizes find()/getAll()/getAvailable() results
  • Apply normalization in media-tool-shared.ts and image.ts model resolution
  • Custom proxy endpoints (non-OpenAI base URLs) are preserved unchanged
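The shared helper and the Proxy wrapper from the first two bullets might look roughly like the following. This is a hedged sketch, not the PR's actual code: the `ResolvedModel`/`ModelRegistry` types and the exact normalization rule (codex models on the stale `openai-responses`/api.openai.com combination rewritten to `openai-codex-responses`/chatgpt.com/backend-api, everything else untouched) are assumptions reconstructed from the description above.

```typescript
// Illustrative stand-ins for the real registry types (assumed shapes).
type ResolvedModel = { id: string; api: string; baseUrl: string };

interface ModelRegistry {
  find(id: string): ResolvedModel | undefined;
  getAll(): ResolvedModel[];
  getAvailable(): ResolvedModel[];
}

const STALE_OPENAI_BASE = "https://api.openai.com";

// Idempotent: already-normalized entries and custom proxy base URLs pass through.
function normalizeResolvedProviderModel(model: ResolvedModel): ResolvedModel {
  const isCodexModel = model.id.startsWith("openai-codex/");
  const hasStaleTransport =
    model.api === "openai-responses" && model.baseUrl.startsWith(STALE_OPENAI_BASE);
  if (!isCodexModel || !hasStaleTransport) return model;
  return {
    ...model,
    api: "openai-codex-responses",
    baseUrl: "https://chatgpt.com/backend-api",
  };
}

// Wrap the registry so every resolution path gets normalized results
// without touching the registry class itself.
function wrapModelRegistryWithProviderNormalization(registry: ModelRegistry): ModelRegistry {
  return new Proxy(registry, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      if (typeof value !== "function") return value;
      if (prop === "find") {
        return (id: string) => {
          const found = target.find(id);
          return found ? normalizeResolvedProviderModel(found) : found;
        };
      }
      if (prop === "getAll" || prop === "getAvailable") {
        return () =>
          target[prop as "getAll" | "getAvailable"]().map(normalizeResolvedProviderModel);
      }
      return value.bind(target);
    },
  });
}
```

Because the helper is a pure function of the resolved model, callers like media-tool-shared.ts can also invoke it directly after their own resolution step without double-normalization concerns.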

Testing

  • 9/9 tests pass (new + existing)
  • Test coverage: model registry normalization, media tool resolution, image understanding, custom proxy preservation
  • 8 files changed, 342 insertions, 66 deletions

@openclaw-barnacle openclaw-barnacle Bot added agents Agent runtime and tooling size: M labels Mar 9, 2026
@greptile-apps
Contributor

greptile-apps Bot commented Mar 9, 2026

Greptile Summary

This PR successfully hoists the normalizeResolvedProviderModel helper into a shared model.provider-normalization.ts module and applies it across all remaining runtime paths that were previously using stale openai-responses/api.openai.com transport for openai-codex models.

Key changes:

  • Shared normalization module extracted from embedded-runner's local implementation
  • discoverModels() wraps ModelRegistry in a Proxy that auto-normalizes find(), getAll(), and getAvailable() results
  • Media tools and image understanding now call normalizeResolvedProviderModel() explicitly after model resolution
  • Custom proxy endpoints (non-OpenAI base URLs) are preserved unchanged

Test coverage: All 9 tests pass, covering model registry normalization, media tool resolution, image understanding end-to-end, and custom proxy preservation. The fix directly addresses the timeout issue reported in #41282.

Confidence Score: 5/5

  • Safe to merge. Normalization logic is idempotent and all targeted timeout paths are correctly fixed with comprehensive test coverage.
All 9 tests pass, including tests for the Proxy-based registry normalization, media tool resolution, image understanding, and custom proxy preservation. The solution is well-targeted to the actual problem (stale openai-codex transport) and introduces no new risks; because the normalization function is idempotent, applying it in multiple places causes no correctness regression.
  • No files require special attention.

Last reviewed commit: 1b266e0

… paths

After openclaw#38736, openai-codex/gpt-5.4 still timed out in some paths
because model discovery, media tools, and image understanding used
the stale openai-responses transport instead of openai-codex-responses.

Hoist normalizeResolvedProviderModel to a shared module and apply it
at model discovery (Proxy wrapper on ModelRegistry), media tool
resolution, and image understanding model lookup.

Fixes openclaw#41282
@jackal092927 jackal092927 force-pushed the fix/codex-transport-path branch from 1b266e0 to 49bc9cf Compare March 9, 2026 22:42
@jackal092927 jackal092927 force-pushed the fix/codex-transport-path branch from 49bc9cf to 1eff16a Compare March 10, 2026 02:24
@openclaw-barnacle openclaw-barnacle Bot added app: macos App: macos size: L and removed size: M labels Mar 10, 2026
@openclaw-barnacle openclaw-barnacle Bot removed the app: macos App: macos label Mar 12, 2026

@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: c224f7e1fa

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment on lines 53 to 55
label: "r: no-ci-pr",
close: true,
message:
"Please don't make PRs for test failures on main.\n\n" +


P1 Badge Re-add close flag for r: no-ci-pr

This rule no longer sets close: true, but the script only closes items when rule.close is truthy in the common close block later in the workflow. In practice, labeling a PR with r: no-ci-pr now posts the warning comment but leaves the PR open, so the auto-triage path for CI-only fix PRs stops working.
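The behavior the reviewer describes can be sketched as follows. This is a hypothetical reconstruction of the triage script's rule shape, assuming (as the comment states) that the common close block only acts when `rule.close` is truthy; the rule/PR types and `applyRule` name are illustrative, not the workflow's real code.

```typescript
// Illustrative triage rule shape: the warning comment always fires,
// but closing only happens when the close flag is truthy.
type TriageRule = { label: string; close?: boolean; message: string };

type PullRequest = { labels: string[]; state: "open" | "closed" };

const noCiPrRule: TriageRule = {
  label: "r: no-ci-pr",
  close: true, // dropping this flag posts the comment but leaves the PR open
  message: "Please don't make PRs for test failures on main.",
};

function applyRule(rule: TriageRule, pr: PullRequest): { commented: boolean; closed: boolean } {
  if (!pr.labels.includes(rule.label)) return { commented: false, closed: false };
  // comment is always posted for a matching label...
  const closed = Boolean(rule.close); // ...but the common close block checks rule.close
  if (closed) pr.state = "closed";
  return { commented: true, closed };
}
```

With `close` omitted, `applyRule` still reports `commented: true` but never transitions the PR to `closed`, which is exactly the auto-triage gap being flagged.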


Comment on lines 394 to 395
const invalidLabel = "invalid";
const spamLabel = "r: spam";
const dirtyLabel = "dirty";


P2 Badge Reinstate r: spam close-and-lock handling

Removing spamLabel here also removed the only explicit branches that closed and locked spam-labeled issues/PRs with lock_reason: "spam". Because r: spam is not part of the generic rules array, applying that label now has no moderation effect and spam items remain open/unlocked.


Comment on lines -3 to -6
on:
push:
tags:
- "v*"


P1 Badge Preserve tag-triggered npm publish workflow

Deleting this workflow removes the repository’s tag-triggered npm release path (on.push.tags: "v*" + npm publish). I checked the remaining files under .github/workflows and there is no other workflow containing an npm publish step, so pushing a release tag will no longer publish openclaw to npm.


@openclaw-barnacle openclaw-barnacle Bot added the app: macos App: macos label Mar 12, 2026

@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 9839dc9a95


label: String?,
spawnedby: String?,
workspacedir: String?)
label: String?)


P2 Badge Restore spawnedBy/workspaceDir in Swift AgentParams model

AgentParams no longer exposes spawnedBy and workspaceDir, but the gateway contract still accepts both (src/gateway/protocol/schema/agent.ts) and src/gateway/server-methods/agent.ts uses them to preserve subagent lineage and workspace overrides. With these fields removed from the generated Swift model, OpenClawKit/macOS clients cannot send values the backend still depends on, so spawned runs lose metadata/behavior that previously worked.



@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 21900aeb25


case execnode = "execNode"
case model
case spawnedby = "spawnedBy"
case spawnedworkspacedir = "spawnedWorkspaceDir"


P2 Badge Remove unsupported spawnedWorkspaceDir patch field

SessionsPatchParams now encodes spawnedWorkspaceDir, but the server contract in src/gateway/protocol/schema/sessions.ts does not define that property and uses additionalProperties: false for SessionsPatchParamsSchema. In practice, any Swift client that sets spawnedworkspacedir will send a payload the gateway rejects as an unknown key, so this generated model is out of sync with the runtime API.
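The rejection mode described here can be sketched with a tiny strict-key validator. This is an assumption-laden illustration: the allowed-key list is an invented subset, not the real SessionsPatchParamsSchema, and only models the `additionalProperties: false` behavior of dropping payloads that carry undeclared keys.

```typescript
// Illustrative subset of declared patch keys (NOT the real schema).
const sessionsPatchAllowedKeys = ["label", "model"];

// With additionalProperties: false, any key outside the declared set
// makes the whole payload invalid.
function validateSessionsPatch(
  payload: Record<string, unknown>,
): { ok: boolean; errors: string[] } {
  const unknown = Object.keys(payload).filter((key) => !sessionsPatchAllowedKeys.includes(key));
  return { ok: unknown.length === 0, errors: unknown.map((key) => `unknown key: ${key}`) };
}
```

A Swift client setting `spawnedWorkspaceDir` would fail this check even if every declared field were valid, which is why the generated model being out of sync is a hard error rather than a silently ignored extra.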


# Conflicts:
#	src/agents/pi-embedded-runner/model.provider-normalization.ts
#	src/agents/tools/media-tool-shared.ts
@openclaw-barnacle openclaw-barnacle Bot removed the app: macos App: macos label Mar 20, 2026
@openclaw-barnacle openclaw-barnacle Bot added the docs Improvements or additions to documentation label Mar 20, 2026
@openclaw-barnacle openclaw-barnacle Bot added the channel: line Channel integration: line label Mar 20, 2026
@openclaw-barnacle openclaw-barnacle Bot removed the docs Improvements or additions to documentation label Mar 20, 2026

@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 038b588dd0


Comment thread extensions/line/runtime-api.ts Outdated
// Keep this barrel thin and aligned with the local extension surface.

export * from "../../src/plugin-sdk/line.js";
export type { OpenClawConfig } from "openclaw/plugin-sdk/line-core";


P1 Badge Restore LINE runtime-api wildcard re-export

Removing export * from runtime-api.ts dropped exports that extensions/line/api.ts still forwards (for example createActionCard, processLineMessage, CardAction, ListItem, and LineChannelData). Because LINE sources import through ../api.js (for example extensions/line/src/card-command.ts), importing the plugin now fails when api.ts tries to re-export names that no longer exist in runtime-api.ts, which breaks LINE plugin loading.


@openclaw-barnacle openclaw-barnacle Bot removed the channel: line Channel integration: line label Mar 20, 2026

@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 7f5d0b6dde


export function discoverModels(authStorage: PiAuthStorage, agentDir: string): PiModelRegistry {
return new PiModelRegistryClass(authStorage, path.join(agentDir, "models.json"));
const registry = new PiModelRegistryClass(authStorage, path.join(agentDir, "models.json"));
return wrapModelRegistryWithProviderNormalization(registry);


P2 Badge Defer registry transport normalization until after overrides

Applying wrapModelRegistryWithProviderNormalization() in discoverModels() normalizes model transport before provider config overrides run. In resolveExplicitModelWithRegistry, applyConfiguredProviderOverrides replaces baseUrl but preserves the discovered api when providerConfig.api is unset (src/agents/pi-embedded-runner/model.ts:133-135,195-210). With older models.json entries (the same stale case this patch targets), an OpenAI model can be converted to openai-responses first, then have a custom proxy baseUrl injected, producing openai-responses on a non-OpenAI endpoint that previously stayed openai-completions. That is a behavior regression for custom OpenAI-compatible proxies unless users now also set api explicitly.
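The ordering hazard the reviewer raises can be made concrete with a small sketch. Everything here is an assumption-flagged reconstruction: `Transport`, `normalizeTransport`, and `applyOverrides` are illustrative stand-ins for the real `wrapModelRegistryWithProviderNormalization` and `applyConfiguredProviderOverrides`, modeling only the claim that overrides replace `baseUrl` while preserving the discovered `api` when `providerConfig.api` is unset.

```typescript
type Transport = { api: string; baseUrl: string };

// Stale OpenAI entries get upgraded; anything else passes through untouched.
function normalizeTransport(t: Transport): Transport {
  if (t.api === "openai-completions" && t.baseUrl.startsWith("https://api.openai.com")) {
    return { api: "openai-responses", baseUrl: t.baseUrl };
  }
  return t;
}

// Overrides replace baseUrl but keep the discovered api when cfg.api is unset.
function applyOverrides(t: Transport, cfg: { baseUrl?: string; api?: string }): Transport {
  return { api: cfg.api ?? t.api, baseUrl: cfg.baseUrl ?? t.baseUrl };
}

const discovered: Transport = { api: "openai-completions", baseUrl: "https://api.openai.com/v1" };
const proxyConfig = { baseUrl: "https://proxy.internal/v1" }; // api deliberately unset

// Normalize first, then override: the custom proxy inherits openai-responses.
const normalizeFirst = applyOverrides(normalizeTransport(discovered), proxyConfig);

// Override first, then normalize: the proxy keeps openai-completions.
const overrideFirst = normalizeTransport(applyOverrides(discovered, proxyConfig));
```

Under this model, the two orderings disagree exactly for the custom-proxy case, which is why the reviewer suggests deferring registry normalization until after configured overrides run.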


@steipete
Contributor

steipete commented Apr 4, 2026

Thanks for the patch. I pulled latest main and checked the current tree; the same normalization is now handled through the shared registry/plugin runtime path rather than the older local shim from this PR.

Current main already has:

  • registry-wide normalization wrapping find/getAll/getAvailable in src/agents/pi-model-discovery.ts:59-137
  • embedded normalization still funneled through src/agents/pi-embedded-runner/model.provider-normalization.ts:4-8
  • image/media callers using discoverModels(...), so they pick up the normalized registry path (src/media-understanding/image.ts:47-68, src/agents/tools/pdf-tool.ts:186-203)
  • current Codex transport assertions in src/agents/pi-embedded-runner/model.test.ts:870-889

Closing as superseded by the current shared transport/runtime normalization on main. Thanks again.

@steipete steipete closed this Apr 4, 2026

Labels

agents Agent runtime and tooling channel: whatsapp-web Channel integration: whatsapp-web size: M

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Bug]: Regression/incomplete fix: openai-codex still times out on GPT-5.4 after #38736 due to remaining non-codex transport path

3 participants