Add support for the OpenAI API#242

Open
levigross wants to merge 3 commits into google:main from levigross:main

Conversation

@levigross

This PR enables basic support for OpenAI models (and endpoints that expose an OpenAI-compatible API).

I am stressing basic support because we will leave tool calling to the next PR :)

@gemini-code-assist
Contributor

Summary of Changes

Hello @levigross, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request lays the groundwork for integrating OpenAI models into the system. It establishes the necessary components to send requests, process responses, and manage streaming interactions with OpenAI's API, ensuring compatibility with existing LLM interfaces. The current implementation focuses on core generation capabilities, with plans to extend support for advanced features like tool calling in future iterations.

Highlights

  • OpenAI API Integration: Introduced basic support for interacting with the OpenAI API, enabling the system to send requests to OpenAI models and compatible endpoints.
  • Dependency Management: Added the official github.com/openai/openai-go/v3 Go client library and several github.com/tidwall JSON utility libraries to go.mod and go.sum.
  • Core OpenAI Model Implementation: Created a new openai package under model/ with openAIModel implementing the model.LLM interface, handling both streaming and non-streaming content generation.
  • Request and Response Conversion: Developed comprehensive logic to convert generic LLMRequest objects into OpenAI-specific request parameters and to transform OpenAI API responses back into the generic genai.GenerateContentResponse format.
  • Streaming Event Handling: Implemented a streamTranslator to process and buffer OpenAI streaming events, particularly for accumulating function call arguments, ensuring proper reconstruction of streamed responses.
  • Error Handling and Constants: Defined specific error types (errors.go) and event type constants (consts.go) for the OpenAI integration to improve clarity and maintainability.

Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

  • Code Review (/gemini review): Performs a code review for the current pull request in its current state.
  • Pull Request Summary (/gemini summary): Provides a summary of the current pull request in its current state.
  • Comment (@gemini-code-assist): Responds in comments when explicitly tagged, both in pull request comments and review comments.
  • Help (/gemini help): Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder at the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

Contributor

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request introduces basic support for the OpenAI API by creating an adapter that conforms to the project's model.LLM interface. The implementation is well-structured, separating concerns like request building, response parsing, and stream handling into their own files. However, I've identified a few critical issues, including a couple of compilation errors and incorrect pointer semantics in the model constructor. I also found a potential race condition in stream handling and a gap in error handling for function call tracking. Addressing these points will significantly improve the robustness and correctness of this new integration.

return m.generate(ctx, params)
}

func (m *openAIModel) generate(ctx context.Context, params responses.ResponseNewParams) iter.Seq2[*model.LLMResponse, error] {

Could you please also support the Chat Completions API? It's better for compatibility with most providers.

@levigross
Author

@gemini-code-assist please re-review this code

Contributor

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request introduces a comprehensive and well-structured adapter for the OpenAI API, implementing the model.LLM interface. The implementation covers request and response conversion, streaming support, and robust error handling for unsupported features. The code is well-tested with good unit and integration test coverage.

My main feedback is a critical issue in NewModel that will prevent compilation due to an attempt to access an unexported field on the openai.Client struct. I've provided suggestions to refactor this to use a pointer to the client, which is a more idiomatic and robust approach for validation in Go. Once this is addressed, the package should be in great shape.

Comment on lines +34 to +45
func NewModel(_ context.Context, modelName string, client openai.Client) (model.LLM, error) {
// We drop the context because OpenAI doesn't take a context when creating a new Client
if modelName == "" {
return nil, ErrModelNameRequired
}
if len(client.Options) == 0 {
return nil, ErrClientRequired
}
return &openAIModel{
client: &client,
name: modelName,
}, nil
Contributor

critical

To fix the compilation issue with client.Options and improve the function's contract, the signature of NewModel should be changed to accept a pointer to an openai.Client. This allows for a nil check to validate the client's presence.

Consequently, the openAIModel struct should store this pointer directly.

Suggested change
func NewModel(_ context.Context, modelName string, client openai.Client) (model.LLM, error) {
// We drop the context because OpenAI doesn't take a context when creating a new Client
if modelName == "" {
return nil, ErrModelNameRequired
}
if len(client.Options) == 0 {
return nil, ErrClientRequired
}
return &openAIModel{
client: &client,
name: modelName,
}, nil
func NewModel(_ context.Context, modelName string, client *openai.Client) (model.LLM, error) {
// We drop the context because OpenAI doesn't take a context when creating a new Client
if modelName == "" {
return nil, ErrModelNameRequired
}
if client == nil {
return nil, ErrClientRequired
}
return &openAIModel{
client: client,
name: modelName,
}, nil
}

option.WithBaseURL(server.URL+"/v1"),
)

llm, err := NewModel(t.Context(), openai.ChatModelGPT4oMini, client)
Contributor

critical

To align with the proposed change of NewModel accepting a pointer to openai.Client, you need to pass the address of the client here.

Suggested change
llm, err := NewModel(t.Context(), openai.ChatModelGPT4oMini, client)
llm, err := NewModel(t.Context(), openai.ChatModelGPT4oMini, &client)

Comment on lines +89 to +107
tests := []struct {
name string
modelName openai.ChatModel
client openai.Client
wantErr error
}{
{
name: "missing model name",
modelName: "",
client: client,
wantErr: ErrModelNameRequired,
},
{
name: "missing client",
modelName: openai.ChatModelGPT4o,
client: openai.Client{},
wantErr: ErrClientRequired,
},
}
Contributor

critical

Following the suggested change in openai.go to use a pointer for the openai.Client, this test table needs to be updated to reflect the new signature of NewModel. The client field in the test struct should be a pointer, and the test cases should be adjusted accordingly. For the "missing client" case, nil should be passed to trigger ErrClientRequired.

Suggested change
tests := []struct {
name string
modelName openai.ChatModel
client openai.Client
wantErr error
}{
{
name: "missing model name",
modelName: "",
client: client,
wantErr: ErrModelNameRequired,
},
{
name: "missing client",
modelName: openai.ChatModelGPT4o,
client: openai.Client{},
wantErr: ErrClientRequired,
},
}
tests := []struct {
name string
modelName openai.ChatModel
client *openai.Client
wantErr error
}{
{
name: "missing model name",
modelName: "",
client: &client,
wantErr: ErrModelNameRequired,
},
{
name: "missing client",
modelName: openai.ChatModelGPT4o,
client: nil,
wantErr: ErrClientRequired,
},
}

@levigross
Author

levigross commented Nov 18, 2025

@dpasiukevich What do I need to do to get this PR reviewed? 😀

@YouChenJun

Thank you for contributing this PR to add support for the OpenAI API! This is a highly valuable feature that will greatly improve the flexibility of the project.
We noticed there are some merge conflicts now. Could you please help resolve them when you have time? This will help move the PR forward for merging.
Let us know if you need any support from our side. Thanks again for your great work! @levigross

@dangkaka

hi, any updates on this?

@yafengio

hi, any updates ?

@wjkoh

wjkoh commented Jan 28, 2026

There are currently two PRs under review and one third-party implementation:

Since the review process for both PRs seems to be lagging, could we merge a basic feature set first and improve it incrementally later? I realize that achieving full OpenAI API compatibility is a significant undertaking, and starting with a core version might be more efficient. @byebyebruce

@dpasiukevich
Collaborator

Folks, sorry for the lack of reply here.

We're not quite sure of the best mechanism for long-term support of ecosystem models; let's redirect this to the adk-go-community repo to unblock things for now.

@dpasiukevich
Collaborator

Let me set up https://github.com/google/adk-go-community and report back here.

Let's keep and grow things there with more models, tools, plugins, etc.

@Kenasvarghese

Hi @dpasiukevich, any update on adk-go-community, or any other solution for using OpenAI models?

@YouChenJun

YouChenJun commented Feb 16, 2026 via email

@wjkoh

wjkoh commented Feb 23, 2026

Let me setup the https://github.com/google/adk-go-community and return back here.

Gentle reminder: Is the adk-go-community repository still coming, or has the decision changed?

@dharmjit

Any updates on the way forward?

@marc1161

Same here, any updates on this?

@honky-tonk

Any updates, guys?
