Commit 4dbf49a

fix(test): scope proxy-routed-no-reasoning test to a non-compat baseUrl (#70904)
The existing test asserted "does not add reasoning for non-reasoning models without existing reasoning payload", but the fixture used { api: "openai-responses", provider: "openai", id: "gpt-5.2" } with no baseUrl. resolveOpenAIRequestCapabilities treats that as the default endpoint class, which has supportsOpenAIReasoningCompatPayload === true, so the test was passing only because the pre-fix wrapper silently skipped undefined payload.reasoning. After the #70904 fix, the new !existingReasoning branch correctly injects { effort: "medium" } for that compat-capable route, and the assertion breaks.

Rename the test to reflect what it should actually cover (proxy routes that fail the compat gate) and add baseUrl: "https://proxy.example.com/v1" to force shouldApplyOpenAIReasoningCompatibility === false. With that, the wrapper returns early and payload.reasoning stays undefined, as intended.

Diagnosis and suggested fix from the #70904 reporter (@douglferreira977) and Greptile.
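For readers outside this codebase, the gate described above can be sketched roughly as below. This is an illustrative TypeScript sketch, not the actual wrapper implementation: the SketchModel/SketchBody types, the sketch* function names, and the simplified "no baseUrl means default endpoint class" check are assumptions drawn from this commit message, standing in for resolveOpenAIRequestCapabilities and shouldApplyOpenAIReasoningCompatibility.

type ReasoningEffort = "low" | "medium" | "high";

interface SketchModel {
  api: string;
  provider: string;
  id: string;
  baseUrl?: string;
}

interface SketchBody {
  reasoning?: { effort: ReasoningEffort };
  messages: unknown[];
}

// Assumption: with no baseUrl the request resolves to the default OpenAI
// endpoint class (supportsOpenAIReasoningCompatPayload === true); a custom
// baseUrl such as a proxy fails the gate.
function sketchShouldApplyCompat(model: SketchModel): boolean {
  return model.provider === "openai" && model.baseUrl === undefined;
}

// Sketch of the post-#70904 wrapper behavior: inject a default effort only
// when the compat gate passes and the caller supplied no reasoning payload.
function sketchApplyThinkingLevel(
  model: SketchModel,
  body: SketchBody,
  effort: ReasoningEffort,
): SketchBody {
  if (!sketchShouldApplyCompat(model)) {
    // Proxy-routed request: return early, body.reasoning stays undefined.
    return body;
  }
  const existingReasoning = body.reasoning;
  if (!existingReasoning) {
    // The new !existingReasoning branch from #70904.
    return { ...body, reasoning: { effort } };
  }
  return body;
}

// Mirrors the two situations in the test file: the default endpoint gets
// { effort: "medium" } injected, the proxy baseUrl leaves reasoning undefined.
const defaultModel: SketchModel = { api: "openai-responses", provider: "openai", id: "gpt-5.2" };
const proxyModel: SketchModel = { ...defaultModel, baseUrl: "https://proxy.example.com/v1" };
console.log(sketchApplyThinkingLevel(defaultModel, { messages: [] }, "medium").reasoning); // { effort: "medium" }
console.log(sketchApplyThinkingLevel(proxyModel, { messages: [] }, "medium").reasoning);   // undefined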
1 parent 23ed33c commit 4dbf49a

1 file changed

src/agents/pi-embedded-runner/openai-stream-wrappers.test.ts

Lines changed: 15 additions & 2 deletions
@@ -79,10 +79,23 @@ describe("createOpenAIThinkingLevelWrapper", () => {
     expect(payloads[0]?.reasoning).toEqual({ effort: "low" });
   });
 
-  it("does not add reasoning for non-reasoning models without existing reasoning payload", () => {
+  it("does not add reasoning for proxy-routed models without existing reasoning payload", () => {
+    // Proxy routes fail shouldApplyOpenAIReasoningCompatibility, so the wrapper
+    // must not inject `body.reasoning` even with the #70904 !existingReasoning
+    // branch. Use a baseUrl that routes the request away from default OpenAI
+    // endpoints to exercise the non-compat path.
     const { baseStreamFn, payloads } = createPayloadCapture();
     const wrapped = createOpenAIThinkingLevelWrapper(baseStreamFn, "medium");
-    void wrapped(openaiModel, { messages: [] }, {});
+    void wrapped(
+      {
+        api: "openai-responses",
+        provider: "openai",
+        id: "gpt-5.2",
+        baseUrl: "https://proxy.example.com/v1",
+      } as Model<"openai-responses">,
+      { messages: [] },
+      {},
+    );
 
     expect(payloads[0]?.reasoning).toBeUndefined();
   });
