[vLLM] Update xformers and remove flashinfer-python #168141
Conversation
Signed-off-by: Huy Do <huydhn@gmail.com>
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/168141
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (5 Unrelated Failures)
As of commit d79ccd0 with merge base 13ec55d:
FLAKY - The following jobs failed but were likely due to flakiness present on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Signed-off-by: Huy Do <huydhn@gmail.com>
yangw-dev
left a comment
LGTM!
thx for removing flashinfer
@pytorchbot merge -f 'No need to run trunk jobs'
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
A couple of changes:

- Update xformers to xformers==0.0.33.post1, the latest version for the 2.9 release.
- Remove the flashinfer-python build; we don't need to compile it anymore after [UX] Add FlashInfer as default CUDA dependency vllm-project/vllm#26443. FlashInfer is now a regular dependency for vLLM (see the sketch after the testing link below).

Testing
https://github.com/pytorch/pytorch/actions/runs/19490188972/job/55780754518
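For illustration only, and not part of this PR: a minimal Python sketch of how one might locally verify the outcome of this change, i.e. that xformers resolves to the pinned version and that flashinfer-python is present as a regular pip-installed dependency rather than something compiled during the build. The `check` helper and `EXPECTED_XFORMERS` constant are hypothetical names; only the 0.0.33.post1 pin comes from the PR description.

```python
# Hypothetical sanity check (illustrative, not PyTorch CI code):
# confirm xformers matches the pin from this PR and that
# flashinfer-python resolves as a regular installed distribution.
from importlib.metadata import version, PackageNotFoundError

EXPECTED_XFORMERS = "0.0.33.post1"  # pin taken from the PR description

def check(pkg: str, expected: str | None = None) -> None:
    """Print the installed version of pkg, flagging a mismatch or absence."""
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        print(f"{pkg}: NOT INSTALLED")
        return
    if expected is not None and installed != expected:
        print(f"{pkg}: {installed} (expected {expected})")
    else:
        print(f"{pkg}: {installed}")

check("xformers", EXPECTED_XFORMERS)
check("flashinfer-python")  # now expected to come straight from PyPI
```

Running this in the build environment should print both package versions; a "NOT INSTALLED" line for flashinfer-python would suggest the regular-dependency path did not take effect.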