
chore(auth): harden invite-link onboarding token flow#26843

Merged
yuneng-berri merged 3 commits into BerriAI:litellm_internal_staging from stuxf:codex/fix-onboarding-invite-token
Apr 30, 2026

Conversation


@stuxf stuxf commented Apr 30, 2026

Relevant issues

Addresses GHSA-2hg5-37xr-3pgm / invite-link token minting finding.

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/test_litellm/ directory (adding at least 1 test is a hard requirement)
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem
  • I have requested a Greptile review by commenting @greptileai and received a Confidence Score of at least 4/5 before requesting a maintainer review

Type

🐛 Bug Fix
✅ Test

Changes

GET /onboarding/get_token accepted an invitation id and immediately minted a live UI session virtual key for the invited user. If an invite URL leaked, a caller could decode the returned UI JWT and use the embedded sk-... key before completing the password claim flow.
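The leak described above is easy to reproduce in principle: a JWT payload is only base64url-encoded, not encrypted, so anyone holding a leaked token can read an embedded key without knowing the signing secret. A stdlib sketch with a hypothetical token (not the real proxy JWT format):

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode a JWT's payload WITHOUT verifying the signature.

    Shows why embedding a live virtual key in a returned UI JWT is
    risky: the payload is readable by anyone who holds the token."""
    payload_b64 = token.split(".")[1]
    # Restore stripped base64 padding before decoding
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Hypothetical pre-fix token whose payload embeds a virtual key
demo = (
    base64.urlsafe_b64encode(b'{"alg":"HS256"}').rstrip(b"=") + b"." +
    base64.urlsafe_b64encode(b'{"key":"sk-demo-123"}').rstrip(b"=") + b".sig"
).decode()
print(decode_jwt_payload(demo)["key"])  # sk-demo-123
```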

This PR moves key minting out of invite lookup and into the one-time password claim path.

Three changes:

  • GET /onboarding/get_token now returns a 15-minute signed onboarding JWT bound to the invite id and user id. It does not mint a LiteLLM virtual key.
  • POST /onboarding/claim_token requires that onboarding JWT, reserves the invite with update_many(... is_accepted=False ...) inside the password-claim transaction, writes the password, marks the invite accepted, and only then mints the final litellm-dashboard UI session key.
  • The onboarding dashboard stores the final claim response token instead of continuing with the pre-claim onboarding token.
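The first bullet's short-lived, invite-bound token can be sketched with the stdlib. This is a stand-in for the real signed JWT (hypothetical names and secret; the actual proxy uses its own signing key and claims):

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret"  # hypothetical; the proxy uses its own signing key

def mint_onboarding_token(invite_id: str, user_id: str, ttl_s: int = 900) -> str:
    """Mint a 15-minute signed token bound to one invite and one user.

    No virtual key is embedded: the token only proves the holder may
    attempt the one-time password claim while it is valid."""
    payload = json.dumps(
        {"invite_id": invite_id, "user_id": user_id, "exp": int(time.time()) + ttl_s}
    ).encode()
    body = base64.urlsafe_b64encode(payload)
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return f"{body.decode()}.{sig}"

def verify_onboarding_token(token: str) -> dict:
    """Check the signature and expiry, then return the bound claims."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(body))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims

tok = mint_onboarding_token("inv-1", "user-1")
print(verify_onboarding_token(tok)["invite_id"])  # inv-1
```

`hmac.compare_digest` is used rather than `==` so the signature check runs in constant time.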

Files

  • Modified: litellm/proxy/proxy_server.py and ui/litellm-dashboard/src/app/onboarding/OnboardingForm.tsx
  • Tests: extends tests/test_litellm/proxy/auth/test_onboarding.py
  • Generated: rebuilt litellm/proxy/_experimental/out/ so the proxy-served static UI matches the dashboard source change

Behaviour notes for operators

  • Opening an invite URL still loads the onboarding page, but the token returned by /onboarding/get_token is no longer a usable LiteLLM key.
  • A leaked invite can only be used to attempt the one-time password claim while the onboarding JWT is valid; successful claim consumes the invite before the final UI session key is minted.
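The "successful claim consumes the invite" guarantee comes from the conditional `update_many(... is_accepted=False ...)` reservation: only the caller whose update matches a row wins. A minimal in-memory sketch of that compare-and-set pattern (hypothetical names; the real code does this with Prisma inside the password-claim transaction):

```python
import threading

class InviteStore:
    """In-memory stand-in for the invite table, illustrating the
    conditional-update reservation that prevents TOCTOU reuse."""

    def __init__(self) -> None:
        self._lock = threading.Lock()
        self._invites: dict = {}  # invite_id -> is_accepted

    def create(self, invite_id: str) -> None:
        self._invites[invite_id] = False

    def reserve(self, invite_id: str) -> bool:
        """Atomically flip is_accepted False -> True.

        Only the first caller gets True, so two concurrent claims of
        the same leaked invite cannot both mint a session key."""
        with self._lock:
            if self._invites.get(invite_id) is False:
                self._invites[invite_id] = True
                return True
            return False

store = InviteStore()
store.create("inv-1")
print(store.reserve("inv-1"))  # True: first claim wins
print(store.reserve("inv-1"))  # False: invite already consumed
```

In the DB version, "won the reservation" means the conditional update reported exactly one affected row; a zero count means another request already consumed the invite.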

Validation

  • uv run black litellm/proxy/proxy_server.py tests/test_litellm/proxy/auth/test_onboarding.py
  • uv run ruff check litellm/proxy/proxy_server.py tests/test_litellm/proxy/auth/test_onboarding.py
  • uv run pytest tests/test_litellm/proxy/auth/test_onboarding.py -q
  • ./node_modules/.bin/vitest run src/app/onboarding/OnboardingForm.test.tsx
  • NPM_CONFIG_MIN_RELEASE_AGE=0 npm run build
  • cd litellm && uv run mypy .

@stuxf stuxf changed the title from "fix(security): harden onboarding invite token flow" to "chore(auth): harden invite-link onboarding token flow" on Apr 30, 2026

stuxf commented Apr 30, 2026

@greptileai


codecov Bot commented Apr 30, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.


@stuxf stuxf force-pushed the codex/fix-onboarding-invite-token branch from be4ae2e to 8205e24 on April 30, 2026 01:55

stuxf commented Apr 30, 2026

@greptileai


greptile-apps Bot commented Apr 30, 2026

Greptile Summary

This PR hardens the invite-link onboarding flow to fix GHSA-2hg5-37xr-3pgm: GET /onboarding/get_token now returns a 15-minute signed onboarding JWT instead of a live session key, and POST /onboarding/claim_token requires that JWT before atomically reserving the invite inside a DB transaction and minting the final session key. The rollback helper, frontend token handling, and test suite are all solid. One minor P2 style note below about the transaction condition being narrower than the pre-flight guard.

Confidence Score: 5/5

Safe to merge — the security fix is correctly implemented with proper TOCTOU protection, rollback on key-mint failure, and frontend guard against a missing session token.

No P0 or P1 issues found. All concerns from previous review threads are addressed in the current code. The one new observation is a P2 style inconsistency between the pre-flight check and the transaction guard condition.

No files require special attention.

Important Files Changed

  • litellm/proxy/proxy_server.py: Core security fix. GET /onboarding/get_token now mints a 15-min onboarding JWT instead of a live session key; POST /onboarding/claim_token uses an atomic update_many + transaction to prevent TOCTOU races; a rollback helper resets invite state on key-mint failure.
  • tests/test_litellm/proxy/auth/test_onboarding.py: Comprehensive new tests covering missing onboarding token, wrong session binding, invalid bearer, concurrent-reuse prevention, rollback on key-mint failure, and the happy path; mocks are correctly isolated and do not weaken existing coverage.
  • ui/litellm-dashboard/src/app/onboarding/OnboardingForm.tsx: Frontend now reads the session token from the claim response (data.token) rather than the pre-claim jwtToken; correctly surfaces a claim error when data.token is absent instead of silently setting an unusable cookie.
  • ui/litellm-dashboard/src/app/onboarding/OnboardingForm.test.tsx: New test for the missing-token error path validates that the cookie is not set and the error message is rendered; existing tests unchanged.


stuxf commented Apr 30, 2026

@greptileai

@stuxf stuxf marked this pull request as ready for review April 30, 2026 02:19

stuxf commented Apr 30, 2026

@greptileai

@yuneng-berri yuneng-berri merged commit a9db887 into BerriAI:litellm_internal_staging Apr 30, 2026
43 checks passed
Bojun-Vvibe added a commit to Bojun-Vvibe/oss-contributions that referenced this pull request Apr 30, 2026
- BerriAI/litellm#26843 (merge-after-nits) — invite-link onboarding token hardening

- google-gemini/gemini-cli#26067 (merge-after-nits) — JetBrains alt-buffer respects user setting

- QwenLM/qwen-code#3622 (merge-as-is) — rewind E2E assertion update post isRealUserTurn

- QwenLM/qwen-code#3609 (merge-after-nits) — zero-width-space placeholder unification

- aaif-goose/goose#8796 (merge-as-is) — newSession _meta wire-protocol field rename
