fix(codex): normalize openai-codex transport in all remaining runtime paths#41450
jackal092927 wants to merge 17 commits into openclaw:main from
Conversation
Greptile Summary

This PR hoists the shared openai-codex transport normalization into the remaining runtime paths. Key changes:

Test coverage: all 9 tests pass, covering model registry normalization, media tool resolution, image understanding end-to-end, and custom proxy preservation. The fix directly addresses the timeout issue reported in #41282.

Confidence Score: 5/5

Last reviewed commit: 1b266e0
… paths

After openclaw#38736, `openai-codex/gpt-5.4` still timed out in some paths because model discovery, media tools, and image understanding used the stale `openai-responses` transport instead of `openai-codex-responses`. Hoist `normalizeResolvedProviderModel` to a shared module and apply it at model discovery (Proxy wrapper on `ModelRegistry`), media tool resolution, and image understanding model lookup.

Fixes openclaw#41282
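The Proxy approach described in the commit message above can be sketched as follows. The registry interface and the normalization rule here are illustrative assumptions based on the names in this PR, not the repository's actual code:

```typescript
// Sketch only: ProviderModel and ModelRegistry are assumed shapes.
interface ProviderModel {
  id: string;
  api: string;
}

interface ModelRegistry {
  find(id: string): ProviderModel | undefined;
  getAll(): ProviderModel[];
}

// Stand-in for the shared normalizeResolvedProviderModel: upgrade stale
// openai-codex entries to the codex-specific transport.
function normalizeResolvedProviderModel(model: ProviderModel): ProviderModel {
  return model.id.startsWith("openai-codex/") && model.api === "openai-responses"
    ? { ...model, api: "openai-codex-responses" }
    : model;
}

// Proxy wrapper that normalizes every model coming out of the registry
// (the PR also covers getAvailable(), elided here for brevity).
function wrapModelRegistryWithProviderNormalization(registry: ModelRegistry): ModelRegistry {
  return new Proxy(registry, {
    get(target, prop, receiver) {
      if (prop === "find") {
        return (id: string) => {
          const model = target.find(id);
          return model ? normalizeResolvedProviderModel(model) : model;
        };
      }
      if (prop === "getAll") {
        return () => target.getAll().map(normalizeResolvedProviderModel);
      }
      const value = Reflect.get(target, prop, receiver);
      return typeof value === "function" ? value.bind(target) : value;
    },
  });
}
```

Wrapping at the registry boundary means every caller that resolves models through `find()`/`getAll()` gets normalized results without each call site opting in.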
Force-pushed 1b266e0 to 49bc9cf
Force-pushed 49bc9cf to 1eff16a
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: c224f7e1fa
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
label: "r: no-ci-pr",
close: true,
message:
  "Please don't make PRs for test failures on main.\n\n" +
Re-add close flag for r: no-ci-pr
This rule no longer sets close: true, but the script only closes items when rule.close is truthy in the common close block later in the workflow. In practice, labeling a PR with r: no-ci-pr now posts the warning comment but leaves the PR open, so the auto-triage path for CI-only fix PRs stops working.
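The rule-and-close interaction described in this comment can be sketched as follows; the rule shape and the close block are assumptions based on the diff, not the workflow's actual script:

```typescript
// Hypothetical shape of a triage rule; field names follow the diff above.
interface TriageRule {
  label: string;
  close?: boolean; // the flag the review says must be re-added
  message: string;
}

const rules: TriageRule[] = [
  {
    label: "r: no-ci-pr",
    close: true,
    message: "Please don't make PRs for test failures on main.\n\n",
  },
];

// Simplified stand-in for the common close block: it only closes an item
// when rule.close is truthy, so dropping the flag leaves the PR open with
// just the warning comment posted.
function actionsFor(rule: TriageRule): string[] {
  const actions = ["comment"];
  if (rule.close) {
    actions.push("close");
  }
  return actions;
}
```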
const invalidLabel = "invalid";
const spamLabel = "r: spam";
const dirtyLabel = "dirty";
Reinstate r: spam close-and-lock handling
Removing spamLabel here also removed the only explicit branches that closed and locked spam-labeled issues/PRs with lock_reason: "spam". Because r: spam is not part of the generic rules array, applying that label now has no moderation effect and spam items remain open/unlocked.
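A minimal sketch of the removed close-and-lock branch, with the GitHub API calls reduced to a plain data structure (the real script presumably calls `issues.update` and `issues.lock`; everything below is an assumed reconstruction):

```typescript
// Label constant as it appears in the diff above.
const spamLabel = "r: spam";

interface Moderation {
  close: boolean;
  lock: boolean;
  lockReason?: "spam";
}

// Because "r: spam" is not covered by the generic rules array, an explicit
// branch like this is the only thing that closes and locks spam items;
// removing it means the label has no moderation effect.
function moderationFor(label: string): Moderation {
  if (label === spamLabel) {
    return { close: true, lock: true, lockReason: "spam" };
  }
  return { close: false, lock: false };
}
```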
on:
  push:
    tags:
      - "v*"
Preserve tag-triggered npm publish workflow
Deleting this workflow removes the repository’s tag-triggered npm release path (on.push.tags: "v*" + npm publish). I checked the remaining files under .github/workflows and there is no other workflow containing an npm publish step, so pushing a release tag will no longer publish openclaw to npm.
💡 Codex Review
Reviewed commit: 9839dc9a95
-    label: String?,
-    spawnedby: String?,
-    workspacedir: String?)
+    label: String?)
Restore spawnedBy/workspaceDir in Swift AgentParams model
AgentParams no longer exposes spawnedBy and workspaceDir, but the gateway contract still accepts both (src/gateway/protocol/schema/agent.ts) and src/gateway/server-methods/agent.ts uses them to preserve subagent lineage and workspace overrides. With these fields removed from the generated Swift model, OpenClawKit/macOS clients cannot send values the backend still depends on, so spawned runs lose metadata/behavior that previously worked.
💡 Codex Review
Reviewed commit: 21900aeb25
case execnode = "execNode"
case model
case spawnedby = "spawnedBy"
case spawnedworkspacedir = "spawnedWorkspaceDir"
Remove unsupported spawnedWorkspaceDir patch field
SessionsPatchParams now encodes spawnedWorkspaceDir, but the server contract in src/gateway/protocol/schema/sessions.ts does not define that property and uses additionalProperties: false for SessionsPatchParamsSchema. In practice, any Swift client that sets spawnedworkspacedir will send a payload the gateway rejects as an unknown key, so this generated model is out of sync with the runtime API.
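To illustrate why `additionalProperties: false` makes this a hard failure: the gateway rejects any key outside the declared set, as in this simplified check (the key list here is an assumed subset, not the real schema):

```typescript
// Assumed subset of the keys SessionsPatchParamsSchema actually declares.
const sessionsPatchKeys = new Set(["execNode", "model", "spawnedBy"]);

// With additionalProperties: false, validation fails if any unknown key is
// present; this helper returns the offending keys.
function unknownPatchKeys(params: Record<string, unknown>): string[] {
  return Object.keys(params).filter((key) => !sessionsPatchKeys.has(key));
}

// A Swift client that encodes spawnedWorkspaceDir produces a payload the
// gateway would reject as containing an unknown key:
unknownPatchKeys({ model: "gpt-5.4", spawnedWorkspaceDir: "/tmp/ws" });
// → ["spawnedWorkspaceDir"]
```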
# Conflicts:
#	src/agents/pi-embedded-runner/model.provider-normalization.ts
#	src/agents/tools/media-tool-shared.ts
# Conflicts:
#	extensions/line/runtime-api.ts
💡 Codex Review
Reviewed commit: 038b588dd0
// Keep this barrel thin and aligned with the local extension surface.
-export * from "../../src/plugin-sdk/line.js";
+export type { OpenClawConfig } from "openclaw/plugin-sdk/line-core";
Restore LINE runtime-api wildcard re-export
Removing export * from runtime-api.ts dropped exports that extensions/line/api.ts still forwards (for example createActionCard, processLineMessage, CardAction, ListItem, and LineChannelData). Because LINE sources import through ../api.js (for example extensions/line/src/card-command.ts), importing the plugin now fails when api.ts tries to re-export names that no longer exist in runtime-api.ts, which breaks LINE plugin loading.
💡 Codex Review
Reviewed commit: 7f5d0b6dde
 export function discoverModels(authStorage: PiAuthStorage, agentDir: string): PiModelRegistry {
-  return new PiModelRegistryClass(authStorage, path.join(agentDir, "models.json"));
+  const registry = new PiModelRegistryClass(authStorage, path.join(agentDir, "models.json"));
+  return wrapModelRegistryWithProviderNormalization(registry);
Defer registry transport normalization until after overrides
Applying wrapModelRegistryWithProviderNormalization() in discoverModels() normalizes model transport before provider config overrides run. In resolveExplicitModelWithRegistry, applyConfiguredProviderOverrides replaces baseUrl but preserves the discovered api when providerConfig.api is unset (src/agents/pi-embedded-runner/model.ts:133-135,195-210). With older models.json entries (the same stale case this patch targets), an OpenAI model can be converted to openai-responses first, then have a custom proxy baseUrl injected, producing openai-responses on a non-OpenAI endpoint that previously stayed openai-completions. That is a behavior regression for custom OpenAI-compatible proxies unless users now also set api explicitly.
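The ordering concern in this review can be sketched as follows. The normalization rule and override logic are simplified stand-ins, not the PR's actual code; in particular, gating the upgrade on the official base URL is an assumption made to show the contrast:

```typescript
interface ResolvedModel {
  provider: string;
  api: string;
  baseUrl: string;
}

// Stand-in for transport normalization: (assumption) only entries still
// pointing at the official endpoint are upgraded.
function normalize(model: ResolvedModel): ResolvedModel {
  return model.provider === "openai" &&
    model.api === "openai-completions" &&
    model.baseUrl === "https://api.openai.com"
    ? { ...model, api: "openai-responses" }
    : model;
}

// Stand-in for applyConfiguredProviderOverrides: replaces baseUrl but
// preserves the discovered api when the provider config leaves it unset.
function applyOverrides(
  model: ResolvedModel,
  config: { baseUrl?: string; api?: string },
): ResolvedModel {
  return {
    ...model,
    baseUrl: config.baseUrl ?? model.baseUrl,
    api: config.api ?? model.api,
  };
}

const stale: ResolvedModel = {
  provider: "openai",
  api: "openai-completions",
  baseUrl: "https://api.openai.com",
};

// Normalize-first: the custom proxy ends up paired with openai-responses,
// where the override-first ordering would have kept openai-completions.
const result = applyOverrides(normalize(stale), { baseUrl: "https://proxy.example.internal" });
```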
Thanks for the patch. I pulled latest.

Closing as superseded by the current shared transport/runtime normalization.
Problem

After #38736, `openai-codex/gpt-5.4` still times out in some runtime paths. Model discovery, media tools, and image understanding use the stale `openai-responses` transport / `api.openai.com` base URL instead of the correct `openai-codex-responses` / `chatgpt.com/backend-api` combination. (fixes #41282)

Root Cause

`normalizeResolvedProviderModel()` only lived in the embedded runner's local module. Other code paths (model registry, media tools, image provider) resolved models from the registry without normalization, so openai-codex models got stale transport config.

Solution

- Hoist `normalizeResolvedProviderModel()` to a shared `model.provider-normalization.ts` module
- Wrap `ModelRegistry` with a `Proxy` that auto-normalizes `find()`/`getAll()`/`getAvailable()` results
- Apply normalization in `media-tool-shared.ts` and `image.ts` model resolution

Testing