Label
Please label your issue so that it can be easily categorized under LMCache Onboarding
Summary
Are there any roadmap points or plans to update the torch dependency so that it stays in sync with the larger neighboring open source projects?
Details
Recently, the KServe HuggingFace server updated its vLLM dependency on the master branch from 0.9.2 to 0.11.2 in its images, together with a torch update from 2.7.0 to 2.9.0:
kserve/kserve@1156129#diff-e6c9e782b66b5c7533e5b0bfa0383346db4dcbdff55cb65438006083267e24b8R56
At the same time, it depends on lmcache 0.3.0, which in turn strictly pins torch 2.7.0 and can cause an unintentional downgrade of torch back to 2.7.0.
This appears to make the environment incompatible with torch 2.9.0, torchvision 0.24.0, and transformers 4.57.1:
https://github.com/kserve/kserve/blob/master/python/kserve/uv.lock#L4226 torch
https://github.com/kserve/kserve/blob/master/python/kserve/uv.lock#L4292 torchvision
https://github.com/kserve/kserve/blob/master/python/kserve/uv.lock#L4328 transformers
vLLM now strictly depends on torch 2.9.0 across a wide range of versions, from 0.11.1 up to the recent 0.13.0, according to its pyproject.toml:
https://github.com/vllm-project/vllm/blob/v0.11.1rc2/pyproject.toml
https://github.com/vllm-project/vllm/blob/v0.13.0/pyproject.toml
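The downgrade mechanics can be sketched with a toy resolver. The pins below come from this issue's description (vLLM requiring torch 2.9.0, lmcache 0.3.0 pinning torch 2.7.0); the resolver itself is a deliberately naive illustration of install-order overwriting, not how pip's backtracking resolver actually works:

```python
# Assumed pins, taken from this issue's description.
pins = {
    "vllm==0.11.2": {"torch": "2.9.0"},    # vLLM pyproject.toml pin
    "lmcache==0.3.0": {"torch": "2.7.0"},  # lmcache strict pin
}

env: dict[str, str] = {}  # installed environment: name -> version

def install(pkg: str) -> None:
    """Naively satisfy a package's pins, overwriting any prior install."""
    for dep, version in pins[pkg].items():
        if env.get(dep) not in (None, version):
            print(f"replacing {dep} {env[dep]} -> {version}")
        env[dep] = version

install("vllm==0.11.2")
install("lmcache==0.3.0")  # silently replaces torch 2.9.0 with 2.7.0
print(env["torch"])        # -> 2.7.0, incompatible with vllm 0.11.2
```

With strict `==` pins on both sides, no install order satisfies both packages; the last-installed pin simply wins.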
Steps / Reproduction (if applicable)
If this relates to a reproducible issue or process, list the steps here:
- docker pull the image kserve/huggingfaceserver:latest-gpu
- Use it to deploy a model
- Encounter an error from transformers like this:
Could not import module 'PreTrainedModel' (operator torchvision::nms does not exist)
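When debugging this inside the image, a quick listing of installed versions makes the mismatch visible. The sketch below is a hypothetical diagnostic using only importlib.metadata; the `torchvision::nms` failure above typically indicates that the installed torchvision was built against a different torch version than the one present:

```python
import importlib.metadata

def report_versions(pkgs=("torch", "torchvision", "transformers", "lmcache")):
    """Return installed distribution versions, or None if a package is absent."""
    versions = {}
    for pkg in pkgs:
        try:
            versions[pkg] = importlib.metadata.version(pkg)
        except importlib.metadata.PackageNotFoundError:
            versions[pkg] = None
    return versions

print(report_versions())
```

In the failing environment described here, one would expect this to show torch 2.7.0 alongside torchvision 0.24.0, confirming the downgrade.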
Expected Outcome / Goal
lmcache aligns its dependency versions with the larger open source LLM frameworks.
Actual Outcome (if applicable)
lmcache may cause dependency conflicts in complex dependency configurations when used as part of larger projects.