
feat: support modelParameters in prompt.yaml files #148

Merged

sgoedecke merged 3 commits into actions:main from dsanders11:feat/prompt-yaml-model-parameters on Nov 24, 2025

Conversation

@dsanders11 (Contributor)

Adds support for the modelParameters property as shown in https://github.com/github/gh-models/blob/main/examples/advanced_template_prompt.yml, and defined here.

I made it so that if maxTokens is defined in modelParameters, it takes precedence over the action's max-tokens input.

cc @sgoedecke
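For reference, a prompt.yaml using modelParameters might look like the following. The modelParameters keys (maxTokens, temperature, topP) are the ones this PR supports; the surrounding fields and prompt content are illustrative, following the structure of the linked gh-models example:

```yaml
name: Example prompt
model: openai/gpt-4o
modelParameters:
  maxTokens: 200
  temperature: 0.5
  topP: 0.9
messages:
  - role: system
    content: You are a helpful assistant.
  - role: user
    content: "{{input}}"
```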

@dsanders11 dsanders11 requested a review from a team as a code owner November 24, 2025 00:13
Copilot AI review requested due to automatic review settings November 24, 2025 00:13
Copilot AI (Contributor) left a comment

Pull request overview

This PR adds support for the modelParameters configuration in prompt.yaml files, allowing users to specify maxTokens, temperature, and topP parameters directly in their prompt configuration files. When maxTokens is defined in modelParameters, it takes precedence over the max-tokens action input.

Key changes:

  • Added ModelParameters interface with optional maxTokens, temperature, and topP properties
  • Updated PromptConfig to include optional modelParameters field
  • Modified parameter extraction logic to prioritize modelParameters values over action inputs
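The precedence rule above can be sketched as follows. This is a minimal illustration, not the actual code from src/main.ts; the resolveMaxTokens helper name is hypothetical:

```typescript
// Optional sampling parameters that may appear in prompt.yaml.
interface ModelParameters {
  maxTokens?: number;
  temperature?: number;
  topP?: number;
}

// Hypothetical sketch of the precedence rule described in this PR:
// a maxTokens value from modelParameters in prompt.yaml wins over the
// action's max-tokens input; otherwise the action input is used.
function resolveMaxTokens(
  modelParameters: ModelParameters | undefined,
  actionMaxTokens: number
): number {
  return modelParameters?.maxTokens ?? actionMaxTokens;
}

console.log(resolveMaxTokens({ maxTokens: 100 }, 200)); // prompt.yaml value wins
console.log(resolveMaxTokens(undefined, 200)); // falls back to the action input
```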

Reviewed changes

Copilot reviewed 5 out of 7 changed files in this pull request and generated 3 comments.

Summary per file:

  • src/prompt.ts: Defines the new ModelParameters interface and integrates it into PromptConfig
  • src/main.ts: Implements precedence logic for maxTokens and extracts temperature and topP from modelParameters
  • src/inference.ts: Adds temperature and topP to the InferenceRequest interface and passes temperature to API calls
  • src/helpers.ts: Updates the buildInferenceRequest signature to accept and pass through temperature and topP parameters
  • dist/index.js: Compiled output reflecting the source code changes
  • tests/helpers-inference.test.ts: Updates existing tests to include undefined values for the new temperature and topP parameters
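The parameter plumbing described above could look roughly like this. The builder below is an illustrative sketch under assumed names, not the actual buildInferenceRequest from src/helpers.ts:

```typescript
// Illustrative request shape: optional sampling parameters are only
// present when the caller supplies them.
interface InferenceRequest {
  model: string;
  messages: Array<{ role: string; content: string }>;
  max_tokens: number;
  temperature?: number;
  top_p?: number;
}

// Hypothetical builder: spread-with-guard keeps undefined optional
// parameters out of the API payload entirely.
function buildRequest(
  model: string,
  messages: Array<{ role: string; content: string }>,
  maxTokens: number,
  temperature?: number,
  topP?: number
): InferenceRequest {
  return {
    model,
    messages,
    max_tokens: maxTokens,
    ...(temperature !== undefined && { temperature }),
    ...(topP !== undefined && { top_p: topP }),
  };
}

const req = buildRequest("openai/gpt-4o", [{ role: "user", content: "hi" }], 200, 0.5);
console.log("top_p" in req); // omitted when topP is undefined
```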


@dsanders11 dsanders11 force-pushed the feat/prompt-yaml-model-parameters branch from 972c98e to 48f0ede on November 24, 2025 00:17
dsanders11 and others added 2 commits November 23, 2025 16:19
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@sgoedecke sgoedecke merged commit 5022b33 into actions:main Nov 24, 2025
6 checks passed
@sgoedecke (Contributor)

LGTM, thanks!


3 participants