Commit 0c9395b

feat(provider/openai): add gpt-5.3-codex (#12814)
## Background

OpenAI released `gpt-5.3-codex` in the API: https://x.com/OpenAIDevs/status/2026379092661289260?s=20

This PR adds support for it.

## Summary

Adds `gpt-5.3-codex` to the OpenAI and Gateway provider model lists. Also adds `gpt-5.2-codex` to the OpenAI model list, which was previously missing. Docs referencing `gpt-5.2-codex` were updated as well to include `gpt-5.3-codex`.

## Checklist

- [ ] Tests have been added / updated (for bug fixes / features)
- [x] Documentation has been added / updated (for bug fixes / features)
- [x] A _patch_ changeset for relevant packages has been added (for bug fixes / features - run `pnpm changeset` in the project root)
- [x] I have reviewed this pull request (self-review)

## Future Work

N/A

## Related Issues

N/A
1 parent 115842b commit 0c9395b
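For context, once a release with this change ships, the new id should be usable like any other OpenAI model in the AI SDK. A minimal sketch of a `generateText` call (the prompt is illustrative, and running it requires an `OPENAI_API_KEY` in the environment):

```typescript
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

// 'gpt-5.3-codex' is the model id this commit adds to the
// OpenAI provider's model list; the call shape is the AI SDK's
// standard generateText API.
const { text } = await generateText({
  model: openai('gpt-5.3-codex'),
  prompt: 'Write a TypeScript function that reverses a string.',
});

console.log(text);
```

Via the Gateway provider, the same model would be referenced by the prefixed id `'openai/gpt-5.3-codex'` added to `GatewayModelId` below.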

File tree

5 files changed (+14, −1 lines changed)

.changeset/silver-camels-rule.md

Lines changed: 6 additions & 0 deletions

```diff
@@ -0,0 +1,6 @@
+---
+'@ai-sdk/gateway': patch
+'@ai-sdk/openai': patch
+---
+
+feat(provider/openai): add `gpt-5.3-codex`
```

content/providers/03-community-providers/13-codex-cli.mdx

Lines changed: 2 additions & 1 deletion

```diff
@@ -91,7 +91,7 @@ const model = codexCli('gpt-5.2-codex');
 
 **Current Generation Models:**
 
-- **gpt-5.2-codex**: Latest agentic coding model
+- **gpt-5.3-codex**: Latest agentic coding model
 - **gpt-5.2**: Latest general purpose model
 - **gpt-5.1-codex-max**: Flagship model with deep reasoning (supports `xhigh` reasoning)
 - **gpt-5.1-codex-mini**: Lightweight, faster variant
@@ -135,6 +135,7 @@ const model = codexCli('gpt-5.1-codex-max', {
 
 | Model | Image Input | Object Generation | Tool Usage | Tool Streaming |
 | -------------------- | ------------------- | ------------------- | ------------------- | ------------------- |
+| `gpt-5.3-codex` | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> |
 | `gpt-5.2-codex` | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> |
 | `gpt-5.2` | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> |
 | `gpt-5.1-codex-max` | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> |
```

content/providers/03-community-providers/46-codex-app-server.mdx

Lines changed: 1 addition & 0 deletions

```diff
@@ -184,6 +184,7 @@ const result = await streamText({
 
 | Model | Image Input | Object Generation | Tool Streaming | Mid-Execution |
 | -------------------- | ------------------- | ------------------- | ------------------- | ------------------- |
+| `gpt-5.3-codex` | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |
 | `gpt-5.2-codex` | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |
 | `gpt-5.1-codex-max` | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |
 | `gpt-5.1-codex-mini` | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |
```

packages/gateway/src/gateway-language-model-settings.ts

Lines changed: 1 addition & 0 deletions

```diff
@@ -126,6 +126,7 @@ export type GatewayModelId =
   | 'openai/gpt-5.2-chat'
   | 'openai/gpt-5.2-codex'
   | 'openai/gpt-5.2-pro'
+  | 'openai/gpt-5.3-codex'
   | 'openai/gpt-oss-120b'
   | 'openai/gpt-oss-20b'
   | 'openai/gpt-oss-safeguard-20b'
```

packages/openai/src/responses/openai-responses-options.ts

Lines changed: 4 additions & 0 deletions

```diff
@@ -42,6 +42,8 @@ export const openaiResponsesReasoningModelIds = [
   'gpt-5.2',
   'gpt-5.2-chat-latest',
   'gpt-5.2-pro',
+  'gpt-5.2-codex',
+  'gpt-5.3-codex',
 ] as const;
 
 export const openaiResponsesModelIds = [
@@ -110,6 +112,8 @@ export type OpenAIResponsesModelId =
   | 'gpt-5.2'
   | 'gpt-5.2-chat-latest'
   | 'gpt-5.2-pro'
+  | 'gpt-5.2-codex'
+  | 'gpt-5.3-codex'
   | 'gpt-5-2025-08-07'
   | 'gpt-5-chat-latest'
   | 'gpt-5-codex'
```
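The change above follows a common TypeScript pattern: a readonly `as const` tuple of model ids alongside a string-literal union type, so new ids only need to be added in the list and the union. A self-contained sketch of that pattern (names here are illustrative, not the package's actual exports):

```typescript
// Readonly tuple of ids that support reasoning; `as const`
// preserves each string as a literal type.
const reasoningModelIds = ['gpt-5.2-codex', 'gpt-5.3-codex'] as const;

// Union type derived from the tuple: 'gpt-5.2-codex' | 'gpt-5.3-codex'.
type ReasoningModelId = (typeof reasoningModelIds)[number];

// Type-guard so callers can narrow an arbitrary string to the union.
function isReasoningModel(id: string): id is ReasoningModelId {
  return (reasoningModelIds as readonly string[]).includes(id);
}

console.log(isReasoningModel('gpt-5.3-codex')); // true
console.log(isReasoningModel('gpt-4o')); // false
```

Deriving the union from the tuple keeps the runtime list and the compile-time type from drifting apart.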
