Closed
Labels: perf (Indicates a performance issue or need for optimization)
Description
Summary
For Codex OAuth sessions using GPT-5, the same ~300-line prompt content is sent twice:
- In the `instructions` API parameter (from `codex_header.txt`)
- As the first user message (from `codex.txt`)
Evidence
The two files are effectively identical (only trailing whitespace differences):

```diff
$ diff packages/opencode/src/session/prompt/codex_header.txt packages/opencode/src/session/prompt/codex.txt
137c137
< - Use the `edit` tool to edit files
---
> - Use the `edit` tool to edit files
191c191
< If the codebase has tests or the ability to build or run, consider using them to verify that your work is complete.
---
> If the codebase has tests or the ability to build or run, consider using them to verify that your work is complete.
```

Code Path
- `codex_header.txt` is loaded via `PROMPT_CODEX_INSTRUCTIONS` at `packages/opencode/src/session/system.ts:17`
- `codex.txt` is loaded via `PROMPT_CODEX` at `packages/opencode/src/session/system.ts:16`
For Codex sessions (`provider.id === "openai" && auth?.type === "oauth"`):
- `codex_header.txt` → set as `options.instructions` at `llm.ts:103`
- `codex.txt` → included in the system array, which becomes the first user message at `llm.ts:191-197`
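The code path above can be sketched as follows. This is a hypothetical reconstruction for illustration only, not the actual opencode source; the function and variable names (`buildRequest`, `Request`, `isCodexOAuth`) are invented, and the prompt constants stand in for the file contents.

```typescript
// Stand-ins for the two ~300-line prompt files, which the diff shows
// are effectively identical.
const PROMPT_CODEX = "...full Codex prompt...";              // codex.txt
const PROMPT_CODEX_INSTRUCTIONS = "...full Codex prompt..."; // codex_header.txt

interface Request {
  instructions?: string;
  messages: { role: string; content: string }[];
}

// Hypothetical sketch of how the duplication arises for Codex OAuth sessions.
function buildRequest(userText: string, isCodexOAuth: boolean): Request {
  const system: string[] = [PROMPT_CODEX];
  const req: Request = { messages: [] };
  if (isCodexOAuth) {
    // Analogous to llm.ts:103 — the header prompt is set as the
    // `instructions` API parameter.
    req.instructions = PROMPT_CODEX_INSTRUCTIONS;
  }
  // Analogous to llm.ts:191-197 — the system array is flattened into the
  // first user message, so the same content goes out a second time.
  req.messages.push({ role: "user", content: system.join("\n") });
  req.messages.push({ role: "user", content: userText });
  return req;
}
```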
Impact
Wasteful token usage: the same ~5K tokens are sent twice per request.
Suggested Fix
Either:
- Differentiate the files: make `codex_header.txt` lightweight operational guidance and `codex.txt` the full prompt
- Skip one when both would be identical
- Clarify whether this duplication is intentional (e.g. for caching/prioritization reasons)
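The second option could look something like the sketch below: skip the system-array copy when it matches the instructions prompt, comparing with trailing whitespace normalized (per the diff, that is the only difference). The helper names (`normalize`, `buildSystem`) are hypothetical, not part of opencode.

```typescript
// Normalize trailing whitespace so the two prompt files compare equal
// despite the whitespace-only diff reported above.
function normalize(prompt: string): string {
  return prompt
    .split("\n")
    .map((line) => line.trimEnd())
    .join("\n")
    .trimEnd();
}

// Hypothetical fix sketch: if the header prompt (already sent via the
// `instructions` parameter) is effectively identical to the full prompt,
// drop the full prompt from the system array instead of sending it twice.
function buildSystem(header: string, full: string): string[] {
  if (normalize(header) === normalize(full)) return [];
  return [full];
}
```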