[serve][llm] update vllm_engine.py to check for VLLM_USE_V1 attribute#58820

Merged
kouroshHakha merged 1 commit into ray-project:master from eloaf:patch-1
Nov 20, 2025

Conversation

@eloaf
Contributor

@eloaf eloaf commented Nov 19, 2025

Description

vLLM 0.11.1 has removed the `VLLM_USE_V1` flag, which causes Serve to crash when using LLMConfig.

Related issues

TODO: I will look for a related issue or open one.

Additional information

Are there additional checks that should happen?
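The fix described above boils down to the usual `getattr` guard against a removed module-level flag. A minimal sketch, assuming the behavior described in this PR (the actual change in `vllm_engine.py` may differ; `uses_v1_engine` and the `SimpleNamespace` stand-ins are illustrative, not real vLLM or Ray names):

```python
# vLLM 0.11.1 removed the module-level VLLM_USE_V1 flag, so code that
# read it directly now raises AttributeError. Reading it defensively
# with getattr() keeps both old and new vLLM versions working.
from types import SimpleNamespace
from typing import Any


def uses_v1_engine(envs: Any) -> bool:
    # On vLLM >= 0.11.1 the flag is gone and V1 is the only engine,
    # so treat a missing attribute as True.
    return bool(getattr(envs, "VLLM_USE_V1", True))


# Stand-ins for the old and new vllm.envs modules:
old_envs = SimpleNamespace(VLLM_USE_V1=False)  # flag still present
new_envs = SimpleNamespace()                   # flag removed in 0.11.1

print(uses_v1_engine(old_envs))  # False
print(uses_v1_engine(new_envs))  # True
```

The point of the pattern is that the caller never touches the attribute directly, so the same code path works whether or not the flag exists.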

@kouroshHakha kouroshHakha added the go (add ONLY when ready to merge, run all tests) label Nov 19, 2025
@kouroshHakha kouroshHakha changed the title Update vllm_engine.py to check for VLLM_USE_V1 attribute [serve][llm] update vllm_engine.py to check for VLLM_USE_V1 attribute Nov 19, 2025
@kouroshHakha kouroshHakha marked this pull request as ready for review November 19, 2025 17:06
@kouroshHakha kouroshHakha requested a review from a team as a code owner November 19, 2025 17:06
@kouroshHakha kouroshHakha enabled auto-merge (squash) November 19, 2025 17:06
@github-actions github-actions bot disabled auto-merge November 19, 2025 17:39
@ray-gardener ray-gardener bot added the serve (Ray Serve Related Issue), llm, and community-contribution (Contributed by the community) labels Nov 19, 2025
Signed-off-by: Eric Laufer <thiboeri@gmail.com>
@kouroshHakha kouroshHakha merged commit 0621b26 into ray-project:master Nov 20, 2025
6 checks passed
400Ping pushed a commit to 400Ping/ray that referenced this pull request Nov 21, 2025
@kevin-bates

@nrghosh, @eicherseiji, @kouroshHakha - thank you for maintaining this repository!

I suspect this didn't make the 2.52.0 release because it was merged a tad too close to the release. Might there be a chance this could be included in the next patch release? If so, is there an eta for that?

ykdojo pushed a commit to ykdojo/ray that referenced this pull request Nov 27, 2025
…ray-project#58820)

Signed-off-by: Eric Laufer <thiboeri@gmail.com>
Signed-off-by: YK <1811651+ykdojo@users.noreply.github.com>
SheldonTsen pushed a commit to SheldonTsen/ray that referenced this pull request Dec 1, 2025
@nrghosh
Contributor

nrghosh commented Dec 1, 2025

@nrghosh, @eicherseiji, @kouroshHakha - thank you for maintaining this repository!

I suspect this didn't make the 2.52.0 release because it was merged a tad too close to the release. Might there be a chance this could be included in the next patch release? If so, is there an eta for that?

Hi @kevin-bates! 2.52.1 was released a few days ago, I believe this commit is included: ray-2.52.1...master

Cheers

@kpal-lilt

kpal-lilt commented Dec 7, 2025

@nrghosh I don't think this made it into 2.52.1; I can see it neither under the 2.52.1 tagged tree nor in the list you posted.

Do we expect a new bug-fix release such as 2.52.2 soon? I am not aware of any workaround here.

@nrghosh
Contributor

nrghosh commented Dec 8, 2025

Hi @kpal-lilt - as a workaround you could use a nightly image

@kpal-lilt

@nrghosh Thanks for the advice, that's what I was trying, but nightly builds are a gamble and definitely not for production. Yesterday's nightly build didn't work for me because it broke other APIs :/ Do we have any insight into whether a new fix release will happen for 2.52?

richardliaw pushed a commit that referenced this pull request Dec 8, 2025
Related PRs that we should review when upgrading fully:
- #58820
- Note from Rui: when we bump to a new vLLM version, we should go with 0.11.2 instead of 0.11.1; 0.11.2 fixes a Ray multi-node PP regression that was introduced when torch-based PP was added
https://github.com/vllm-project/vllm/releases/tag/v0.11.2

Issues:
- closes #58937
- closes #58973
- closes #58702

---------

Signed-off-by: Kourosh Hakhamaneshi <Kourosh@anyscale.com>
Signed-off-by: Seiji Eicher <seiji@anyscale.com>
Signed-off-by: Nikhil Ghosh <nikhil@anyscale.com>
Signed-off-by: Nikhil G <nrghosh@users.noreply.github.com>
Signed-off-by: elliot-barn <elliot.barnwell@anyscale.com>
Co-authored-by: Seiji Eicher <seiji@anyscale.com>
Co-authored-by: Nikhil Ghosh <nikhil@anyscale.com>
Co-authored-by: Nikhil G <nrghosh@users.noreply.github.com>
Co-authored-by: elliot-barn <elliot.barnwell@anyscale.com>
peterxcli pushed a commit to peterxcli/ray that referenced this pull request Feb 25, 2026
peterxcli pushed a commit to peterxcli/ray that referenced this pull request Feb 25, 2026