
feat: [#915] add AI module #1410

Merged
hwbrzzl merged 9 commits into master from bowen/#915
Mar 14, 2026

Conversation

@hwbrzzl (Contributor) commented Mar 14, 2026

Overview

This PR introduces the AI module for Goravel — a provider-agnostic interface for building LLM-powered features in your application.

Features

  • contracts/ai — Public interfaces: AI, Conversation, Provider (driver), Agent, Response, Usage, Message, Option
  • ai/ — Core implementation: Application (manager), conversation, option helpers (WithProvider, WithModel, WithTimeout)
  • ai/service_provider.go — Service provider wired into the IoC container
  • mocks/ai/ — Generated testify mocks for all AI interfaces
  • facades/ — Static AI() facade accessor

Examples

1. Basic prompt

conv, err := facades.AI().
    WithContext(ctx).
    Agent(myAgent)
if err != nil {
    // handle error
}

resp, err := conv.Prompt("What is Goravel?")
if err != nil {
    // handle error
}
fmt.Println(resp.Text(ctx))

2. Choose a specific provider and model at call-time

conv, err := facades.AI().
    WithContext(ctx).
    Agent(myAgent, ai.WithProvider("openai"), ai.WithModel("gpt-4o"))

resp, _ := conv.Prompt("Summarise the Goravel documentation.")
fmt.Println(resp.Text(ctx))

// token usage
usage := resp.Usage(ctx)
fmt.Printf("input=%d output=%d total=%d\n",
    usage.InputTokens(ctx), usage.OutputTokens(ctx), usage.TotalTokens(ctx))

3. Multi-turn conversation

conv, _ := facades.AI().WithContext(ctx).Agent(myAgent)

r1, _ := conv.Prompt("Hello, my name is Alice.")
fmt.Println(r1.Text(ctx)) // "Nice to meet you, Alice!"

r2, _ := conv.Prompt("What is my name?")
fmt.Println(r2.Text(ctx)) // "Your name is Alice."

// inspect full history
for _, msg := range conv.Messages() {
    fmt.Printf("[%s] %s\n", msg.Role, msg.Content)
}

4. Reset conversation history

conv, _ := facades.AI().WithContext(ctx).Agent(myAgent)
conv.Prompt("Remember this: the magic word is banana.")
conv.Reset() // clears runtime history, restores agent seed messages

// fresh start — model no longer knows the magic word
reply, _ := conv.Prompt("What is the magic word?")
fmt.Println(reply.Text(ctx))

5. Custom timeout

import "time"

conv, _ := facades.AI().
    WithContext(ctx).
    Agent(myAgent, ai.WithTimeout(30*time.Second))

reply, _ := conv.Prompt("Write a short story.")
fmt.Println(reply.Text(ctx))

6. Implementing a custom Provider driver

type MyProvider struct{}

func (p *MyProvider) Prompt(
    ctx context.Context,
    agent ai.Agent,
    messages []ai.Message,
    options ...ai.Option,
) (ai.Response, error) {
    // call your LLM API here and wrap the result
    // in your own ai.Response implementation
    return myResponse, nil
}

Related

Closes #915

Summary by CodeRabbit

  • New Features
    • Added AI framework supporting agent-based conversations and multi-turn interactions
    • Support for configurable AI providers, models, and conversation options
    • Built-in conversation history management and token usage tracking
    • Context-aware operations with conversation reset and message retrieval

@hwbrzzl hwbrzzl requested a review from a team as a code owner March 14, 2026 03:34
@hwbrzzl (Contributor, Author) commented Mar 14, 2026

@coderabbitai review

@coderabbitai bot (Contributor) commented Mar 14, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

@hwbrzzl hwbrzzl closed this Mar 14, 2026
@hwbrzzl hwbrzzl reopened this Mar 14, 2026
@coderabbitai bot (Contributor) commented Mar 14, 2026

📝 Walkthrough

Walkthrough

This pull request introduces a comprehensive AI management system for the framework, including contract interfaces for AI orchestration (AI, Conversation, Agent, Provider, Response, Usage, Message, Option), implementations with Application and ServiceProvider, helper option functions, integration with the container and facade system, and extensive auto-generated mock implementations for testing.

Changes

Cohort / File(s) Summary
AI Contract Definitions
contracts/ai/agent.go, contracts/ai/ai.go, contracts/ai/driver.go, contracts/ai/message.go, contracts/ai/option.go, contracts/ai/response.go
Establishes core AI orchestration interfaces including AI manager, Conversation session handling, Agent definitions, Provider for model interaction, Response with token usage metrics, and Message/MessageRole types for conversation history.
AI Implementation
ai/application.go, ai/conversation.go, ai/option.go, ai/service_provider.go
Implements Application as the AI manager with context support, conversation instantiation, and ServiceProvider for dependency injection wiring. Includes placeholder Conversation implementation and option helper constructors (WithProvider, WithModel, WithTimeout).
Container & Facade Integration
contracts/binding/binding.go, contracts/facades/facades.go, contracts/foundation/application.go, facades/facades.go, foundation/container.go
Registers AI binding constant, adds AI facade accessor, extends Application interface with MakeAI method, and implements container resolution for the AI service.
Mock Implementations
mocks/ai/AI.go, mocks/ai/Agent.go, mocks/ai/Ai.go, mocks/ai/Conversation.go, mocks/ai/Option.go, mocks/ai/Provider.go, mocks/ai/ProviderCreator.go, mocks/ai/Response.go, mocks/ai/Usage.go, mocks/foundation/Application.go
Auto-generated testify/mock implementations for all AI interfaces providing typed expectation helpers, Run/Return/RunAndReturn chaining, and integration with testing lifecycle.
Documentation & Tests
AGENTS.md, support/convert/facade_test.go
Updates AI development guidelines and adds assertions validating AI facade and binding mappings.

Sequence Diagram

sequenceDiagram
    participant App as Application
    participant AI as AI Manager
    participant Conv as Conversation
    participant Prov as Provider
    participant Resp as Response
    
    App->>AI: WithContext(ctx)
    activate AI
    AI-->>App: AI instance (new context)
    deactivate AI
    
    App->>AI: Agent(agent, options...)
    activate AI
    AI->>Conv: Create conversation with options
    activate Conv
    Conv-->>AI: Conversation instance
    deactivate Conv
    AI-->>App: Conversation
    deactivate AI
    
    App->>Conv: Prompt(input)
    activate Conv
    Conv->>Prov: Prompt(ctx, agent, messages, options)
    activate Prov
    Prov-->>Resp: Response
    deactivate Prov
    Conv->>Conv: Update messages history
    Conv-->>App: Response
    deactivate Conv
    
    App->>Conv: Messages()
    activate Conv
    Conv-->>App: []Message
    deactivate Conv
    
    App->>Conv: Reset()
    activate Conv
    Conv->>Conv: Clear messages, restore initial
    deactivate Conv

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related issues

  • #915 — While this issue focuses on environment variable refactoring, both PRs contribute to the framework's foundational infrastructure and may share related CI/build considerations.

Possibly related PRs

  • #868 — Modifies binding infrastructure and contracts/binding configuration patterns that align with the AI binding registration in this PR.
  • #1360 — Extends the Application interface in contracts/foundation/application.go similar to the MakeAI method addition in this PR.
  • #778 — Updates auto-generated mock implementations in mocks/foundation/Application.go, sharing similar mock generation patterns with this PR.

Suggested reviewers

  • devhaozi
  • almas-x

🐰 Behold, dear friends, an AI awakens,
With conversations that sing and reason,
Providers and responses intertwine,
Context flows through contracts so fine,
The framework dances with intelligent grace! 🎭✨

🚥 Pre-merge checks | ✅ 1 | ❌ 4

❌ Failed checks (4 warnings)

  • Title check: ⚠️ Warning. The PR title references issue #915, but the actual changes implement an AI module instead of the environment variable refactoring described in issue #915. Resolution: update the PR title to accurately reflect the changes, such as 'feat: add AI module' or 'feat: [#XXX] add AI module' with the correct issue number.
  • Linked Issues check: ⚠️ Warning. The PR claims to close issue #915 about environment variable refactoring, but implements an AI module instead. None of the coding objectives from issue #915 (splitting constant.go, renaming support variables) are present in the changes. Resolution: either implement the actual requirements from issue #915 (split constant.go, rename support.Env/EnvPath/EnvEncryptPath/EnvEncryptCipher) or remove the reference to issue #915 and link to the correct issue for the AI module work.
  • Out of Scope Changes check: ⚠️ Warning. The entire changeset is out of scope relative to issue #915. The PR introduces a new AI module with contracts, implementations, service provider, facades, and mocks, none of which are mentioned in issue #915's refactoring requirements. Resolution: if adding the AI module is the actual goal, create a new issue for it and update the PR to reference it; if issue #915 is the real target, replace this changeset with the environment variable refactoring work.
  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 52.94%, which is below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.

✅ Passed checks (1 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.


@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 7

🧹 Nitpick comments (3)
mocks/ai/Ai.go (1)

28-33: Update mock generation template to use any instead of interface{}.

The mocks still emit interface{} in multiple places. Please adjust the mock generation config/template so that regenerated files follow the project Go style rule of using any instead of interface{} (applies to mocks/ai/Ai.go, mocks/ai/AI.go, mocks/ai/Provider.go, mocks/ai/Response.go, mocks/ai/Usage.go, and other generated mock files).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@mocks/ai/Ai.go` around lines 28 - 33, The generated mocks use interface{}
instead of Go's preferred alias any; update the mock generation template/config
so all occurrences emit any (e.g., change make([]interface{}, len(options)) ->
make([]any, len(options)) and var _ca []interface{} -> var _ca []any) and then
regenerate the mocks (check files like mocks/ai/Ai.go, Ai.go symbols _va,
options, _ca, agent and other generated files such as AI.go, Provider.go,
Response.go, Usage.go to ensure all interface{} instances are replaced with
any).
contracts/ai/response.go (1)

5-17: Consider removing context.Context from getter methods.

The context.Context parameter on Text(), Usage(), Raw(), InputTokens(), OutputTokens(), and TotalTokens() appears unnecessary. These methods typically return cached or computed values from an already-received response. Context is useful for cancellation and deadlines during I/O operations, but these getters should not perform any blocking work.

If these methods are purely returning stored data, removing the context parameter would simplify the API:

type Response interface {
    Text() string
    Usage() Usage
    Raw() any
}

type Usage interface {
    InputTokens() int
    OutputTokens() int
    TotalTokens() int
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@contracts/ai/response.go` around lines 5 - 17, The getters on the Response
and Usage interfaces unnecessarily take context.Context; remove the context
parameter from Response.Text, Response.Usage, Response.Raw and from
Usage.InputTokens, Usage.OutputTokens, Usage.TotalTokens, then update all
implementing types and callers to the new no-context signatures (adjust receiver
methods on concrete response/usage structs, update interface assertions and any
places calling these methods), remove unused context imports where applicable,
and run tests/build to ensure all call sites are updated.
mocks/ai/Response.go (1)

27-70: Align the generated Raw helpers with the contract’s any type.

Lines 27, 34, 53, 64, and 69 still emit interface{} even though contracts/ai/response.go already exposes Raw(ctx context.Context) any. Since this file is generated, please fix this in the mockery template/config once so regenerating the AI mocks does not keep reintroducing the legacy spelling here and in the other new mock files.

As per coding guidelines, **/*.go: Use any instead of interface{} when writing Go code`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@mocks/ai/Response.go` around lines 27 - 70, The generated mock uses the old
Go alias interface{} instead of any; update the mock to use any in the
Response.Raw signature and all helper methods: change the return and parameter
types for Response.Raw(ctx context.Context) (r0 any), the
Response_Raw_Call.Return(_a0 any), the helper Raw(ctx interface{}) (you may keep
the helper param type as interface{} if needed but ensure any is used where it
represents the contract type), and RunAndReturn(run func(context.Context) any)
so the types line up with contracts/ai/response.go; then fix the mockery
template/config so future regenerations emit any instead of interface{} (adjust
the template type mapping or generator flags that control Go 1.18+ alias usage).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@ai/application.go`:
- Around line 36-43: The WithContext method (Application.WithContext) should
guard against a nil ctx to avoid downstream panics: inside WithContext, check if
the provided ctx is nil and if so substitute context.Background() (or
context.TODO()) before constructing the new Application instance so the returned
AI always has a non-nil ctx; update the function to use that non-nil ctx when
setting the ctx field.
- Around line 46-47: The public method Application.Agent currently returns (nil,
nil) which signals success with a nil conversation and can cause nil
dereferences; update Application.Agent to return a non-nil error when the
operation is unimplemented or fails (e.g., an explicit "not implemented" or
appropriate domain error) instead of returning (nil, nil). Locate the
Application.Agent(func (r *Application) Agent(...)) implementation and replace
the nil-success return with a descriptive error return (keeping the conversation
nil) or implement the actual creation logic so callers never observe (nil, nil).

In `@ai/conversation.go`:
- Around line 22-24: The Prompt method on conversation currently returns nil,
nil and never delegates to the underlying provider; replace the no-op by
invoking the conversation's provider Prompt call (e.g.,
r.provider.Prompt(input)) and return its contractsai.Response and error, and if
r.provider is nil return a meaningful error instead of nil, nil so failures
aren't masked; update conversation.Prompt to forward input to the provider,
handle a nil r.provider (or wrap provider errors) and propagate the provider's
response and error up to the caller.
- Around line 26-31: The Conversation methods are missing implementations:
implement conversation.Messages() to return the conversation's message slice
(e.g. return r.messages) rather than nil, and implement conversation.Reset() to
clear that slice and any related state (e.g. set r.messages = nil or r.messages
= r.messages[:0] and reset any indexes/cursors/metadata on the conversation
struct); if the struct uses synchronization (e.g. r.mu or similar), perform
these reads/writes under the appropriate lock to avoid races.

In `@mocks/ai/Ai.go`:
- Around line 14-16: There is a duplicate type declaration for AI (type AI
struct { mock.Mock }) causing a redeclaration error; locate all declarations of
the AI mock (type name AI and any files in the mocks/ai package that define it),
remove the redundant declaration, and then regenerate mocks so only a single AI
mock type exists (ensure casing-consistent filenames to avoid duplicate
generation across different files).

In `@mocks/ai/AI.go`:
- Around line 1-159: Delete the duplicate autogenerated mock file named
mocks/ai/Ai.go (which collides with mocks/ai/AI.go) so only mocks/ai/AI.go
remains; ensure no other code imports or references the deleted file (tests
should reference the mock type AI and functions like NewAI, AI.Agent,
AI.WithContext from mocks/ai/AI.go) and run the test/build to confirm the
case-insensitive filename collision is resolved.

In `@mocks/ai/Option.go`:
- Around line 1-65: The generated mock incorrectly models the function type
Option as a struct with Execute (symbols: Option, Execute, NewOption); remove
this mock file entirely and update .mockery.yaml to exclude the Option function
(either remove it from generated list or set all: false and explicitly list
types), or if you must keep a mock-like helper, replace usages with a simple
function stub rather than testify/mock structs and use map[string]any (not
map[string]interface{}) where applicable; ensure no other code imports or
references mocks/ai/Option.go before deleting.

(The nitpick-comment prompts duplicate the 🤖 prompts already shown above.)

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 03877a7b-5651-403a-b5a9-cd5a3a7437d9

📥 Commits

Reviewing files that changed from the base of the PR and between e44c483 and 69c8d8d.

📒 Files selected for processing (27)
  • AGENTS.md
  • ai/application.go
  • ai/conversation.go
  • ai/option.go
  • ai/service_provider.go
  • contracts/ai/agent.go
  • contracts/ai/ai.go
  • contracts/ai/driver.go
  • contracts/ai/message.go
  • contracts/ai/option.go
  • contracts/ai/response.go
  • contracts/binding/binding.go
  • contracts/facades/facades.go
  • contracts/foundation/application.go
  • facades/facades.go
  • foundation/container.go
  • mocks/ai/AI.go
  • mocks/ai/Agent.go
  • mocks/ai/Ai.go
  • mocks/ai/Conversation.go
  • mocks/ai/Option.go
  • mocks/ai/Provider.go
  • mocks/ai/ProviderCreator.go
  • mocks/ai/Response.go
  • mocks/ai/Usage.go
  • mocks/foundation/Application.go
  • support/convert/facade_test.go

@almas-x (Contributor) commented Mar 14, 2026

One API thought: right now WithProvider / WithModel are passed as options into Agent():

conv, err := facades.AI().
    WithContext(ctx).
    Agent(myAgent, ai.WithProvider("openai"), ai.WithModel("gpt-4o"))

Could we move those up to ai.Application as chain methods (like WithContext), and keep Agent() just for picking/starting the agent?

conv, err := facades.AI().
    WithContext(ctx).
    WithProvider("openai").
    WithModel("gpt-4o").
    Agent(myAgent)

Feels a bit cleaner/readable, and matches the idea that provider/model are usually conversation-level config rather than agent-scoped. If there’s a reason they need to stay per-agent, totally fine—just wanted to float the idea.

@hwbrzzl (Contributor, Author) commented Mar 14, 2026

> One API thought: right now WithProvider / WithModel are passed as options into Agent(): … (full comment quoted above)

@copilot Compare these two styles above.

Copilot AI (Contributor) commented Mar 14, 2026

@hwbrzzl I've opened a new pull request, #1411, to work on those changes. Once the pull request is ready, I'll request review from you.

@codecov bot commented Mar 14, 2026

Codecov Report

❌ Patch coverage is 0% with 35 lines in your changes missing coverage. Please review.
✅ Project coverage is 67.18%. Comparing base (e44c483) to head (0812f90).
⚠️ Report is 1 commit behind head on master.

Files with missing lines | Patch % | Lines
ai/service_provider.go | 0.00% | 11 Missing ⚠️
ai/option.go | 0.00% | 9 Missing ⚠️
foundation/container.go | 0.00% | 6 Missing ⚠️
ai/conversation.go | 0.00% | 5 Missing ⚠️
ai/application.go | 0.00% | 4 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #1410      +/-   ##
==========================================
- Coverage   67.28%   67.18%   -0.10%     
==========================================
  Files         344      348       +4     
  Lines       25574    25609      +35     
==========================================
- Hits        17208    17206       -2     
- Misses       7638     7674      +36     
- Partials      728      729       +1     

☔ View full report in Codecov by Sentry.

@hwbrzzl (Contributor, Author) commented Mar 14, 2026

@almas-x I think it's better that the provider and the model belong to agents instead of being set in advance. The normal flow when using agents should be: decide which agent to use, then which model is suitable, not choose the model first and then the agent. The provider and the model can also be set in the agent struct (not implemented in this PR yet).

@almas-x (Contributor) commented Mar 14, 2026

> I think it's better that the provider and the model belong to agents instead of being set in advance. … (full comment quoted above)

Yeah, that makes sense. If provider/model are meant to be agent-owned, pushing them down into the agent struct feels like the cleanest end state.👍

@krishankumar01 (Member) commented:

@hwbrzzl a couple of questions here:

Why does resp.Text(ctx) take a context? By this point conv.Prompt(...) has already returned, so the network call is done. If this is for streaming it makes sense, but then it should return (string, error) not a silent string. Same goes for resp.Usage(ctx), InputTokens(ctx) etc. since those return plain int with no IO, what is the context actually doing here?

And why is context pinned at conversation level via WithContext(ctx) instead of per Prompt call? Each turn is a separate request and can have a different deadline. conv.Prompt(ctx, "...") feels safer and more idiomatic here.

@hwbrzzl (Contributor, Author) commented Mar 14, 2026

@krishankumar01 Good catch, updated.

@krishankumar01 (Member) commented:

@hwbrzzl Couple more questions here:

How are we handling streaming responses? conv.Prompt() blocks until the full response is received which can be a problem for long outputs, especially for people integrating with HTTP SSE or any streaming mechanism. Is there a plan for something like conv.Stream(ctx, "...") that returns a channel or io.Reader?

Also, how is a default provider configured? The examples show WithProvider("openai"), but where do the API keys and settings live? Are they pulled from Goravel config or registered manually somewhere?

@hwbrzzl (Contributor, Author) commented Mar 14, 2026

@krishankumar01 this PR only contains the basic code structure. I will add the first provider and the stream function in the following PRs.

@hwbrzzl hwbrzzl closed this Mar 14, 2026
@hwbrzzl hwbrzzl reopened this Mar 14, 2026
@hwbrzzl hwbrzzl closed this Mar 14, 2026
@hwbrzzl hwbrzzl reopened this Mar 14, 2026
@hwbrzzl (Contributor, Author) commented Mar 14, 2026

@goravel/core-developers could you give an approval if there are no questions? Thanks

@krishankumar01 (Member) left a comment

LGTM. I'll review it again when we implement these in the next PR.

@hwbrzzl hwbrzzl merged commit 0cb7658 into master Mar 14, 2026
16 of 18 checks passed
@hwbrzzl hwbrzzl deleted the bowen/#915 branch March 14, 2026 09:12