
Add model parameters temperature and topP to action inputs#168

Merged
stephaniegiang merged 4 commits into actions:main from GitPaulo:gitpaulo/fork-add-temperature-topp-params
Feb 4, 2026

Conversation

Contributor

@GitPaulo commented Feb 4, 2026

Description

This PR adds support for configuring the sampling parameters temperature and top-p for model inference.

Note: these parameters can now be set via action inputs or via the YAML prompt configuration, with the YAML config taking precedence.
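For illustration, a workflow step could set the new inputs like this. This is a sketch: the `uses:` reference, step name, and `prompt` value are placeholders, not taken from this PR; only the `temperature` and `top-p` input names come from the updated action.yml.

```yaml
- name: Run AI inference            # placeholder step name
  uses: actions/ai-inference@main   # placeholder ref; pin a released version in practice
  with:
    prompt: 'Summarize this issue'  # placeholder prompt
    temperature: '0.2'              # new input added by this PR
    top-p: '0.9'                    # new input added by this PR
```

If the prompt is instead supplied as a YAML prompt file whose `modelParameters` block sets `temperature` or `topP`, those values win over the action inputs above.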

Related Issues: #38

@GitPaulo GitPaulo requested a review from a team as a code owner February 4, 2026 12:14
Copilot AI review requested due to automatic review settings February 4, 2026 12:14

Copilot AI left a comment


Pull request overview

Adds support for configuring LLM sampling parameters (temperature and top-p) via GitHub Action inputs, while keeping YAML prompt modelParameters as the higher-precedence source.

Changes:

  • Added temperature and top-p inputs to action.yml.
  • Read and parse temperature / top-p inputs in src/main.ts, with prompt YAML modelParameters taking precedence.
  • Updated README.md and rebuilt dist/index.js to reflect the new inputs/behavior.

Reviewed changes

Copilot reviewed 3 out of 5 changed files in this pull request and generated 1 comment.

Files reviewed:

  • src/main.ts: Parses the new sampling inputs and forwards them into the inference request with YAML precedence.
  • dist/index.js: Compiled bundle updated to include the new input parsing and request wiring.
  • action.yml: Declares the new action inputs temperature and top-p.
  • README.md: Documents the new inputs in the Inputs table.
Comments suppressed due to low confidence (1)

src/main.ts:86

  • New behavior (action inputs temperature / top-p and YAML-precedence) isn’t covered by existing src/main.ts tests. Add test cases to verify: (1) values are parsed and passed through when set via action inputs, (2) YAML modelParameters.temperature/topP override action inputs, and (3) invalid numeric inputs are handled as expected.
    // Get temperature and topP (prompt YAML modelParameters takes precedence over action inputs)
    const temperatureInput = core.getInput('temperature')
    const topPInput = core.getInput('top-p')
    const temperature =
      promptConfig?.modelParameters?.temperature ?? (temperatureInput !== '' ? parseFloat(temperatureInput) : undefined)
    const topP = promptConfig?.modelParameters?.topP ?? (topPInput !== '' ? parseFloat(topPInput) : undefined)

    // Parse custom headers
    const customHeadersInput = core.getInput('custom-headers')
    const customHeaders = parseCustomHeaders(customHeadersInput)

    // Build the inference request with pre-processed messages and response format
    const inferenceRequest = buildInferenceRequest(
      promptConfig,
      systemPrompt,
      prompt,
      modelName,
      temperature,
      topP,


@stephaniegiang stephaniegiang merged commit 268593b into actions:main Feb 4, 2026
6 checks passed
