[Attention] Refactor CUDA attention backend selection logic #24794
mgoin merged 121 commits into vllm-project:main from
Conversation
This pull request has merge conflicts that must be resolved before it can be merged.
LucasWilkinson
left a comment
left a few comments. We should figure out who owns the plugin mechanism and how to notify downstream HW plugins, since I think this will affect them pretty dramatically.
Signed-off-by: Matthew Bonanni <mbonanni@redhat.com>
@LucasWilkinson thanks for your review! I've already notified @ILikeIneine but I'm not sure if there's anyone else we should reach out to?
@MatthewBonanni Hi, could this refactor be merged into v0.11.1?
@ILikeIneine we were planning on waiting until after v0.11.1: we don't want to risk further delaying the release, and since this changes the platform interface, it might be better as part of v0.12.
@MatthewBonanni how hard would it be to keep backwards compatibility between
@LucasWilkinson done in d0f4698
Discussed offline; thanks for the work @MatthewBonanni!
Review snippet:

`return AttentionBackendEnum[name]`

`class _Backend(metaclass=_BackendMeta):`
mgoin
left a comment
Release has been cut, let's go for it on main
The merge commit of this PR failed |
Purpose

`CudaPlatformBase.get_attention_backend_cls` has gotten complex and messy over time. This PR cleans up the logic (without changing the behavior) and standardizes the interface.

Test Plan
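To make the intent of a "standardized interface" concrete, here is a hedged sketch of what enum-driven backend selection could look like. All names, import paths, and capability cutoffs below are hypothetical illustrations, not the actual vLLM code:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class AttentionBackendEnum(Enum):
    # Hypothetical members and import paths, for illustration only.
    FLASH_ATTN = "vllm.attention.backends.flash_attn.FlashAttentionBackend"
    XFORMERS = "vllm.attention.backends.xformers.XFormersBackend"


@dataclass(frozen=True)
class DeviceCapability:
    major: int
    minor: int


def get_attention_backend_cls(
    selected: Optional[AttentionBackendEnum],
    capability: DeviceCapability,
) -> str:
    """Return the import path of the attention backend class to use."""
    if selected is not None:
        # An explicit user selection always wins.
        return selected.value
    # Example heuristic: prefer FlashAttention on Ampere (SM 8.0) and newer.
    if (capability.major, capability.minor) >= (8, 0):
        return AttentionBackendEnum.FLASH_ATTN.value
    return AttentionBackendEnum.XFORMERS.value
```

The design point is that every platform answers the same question through one typed entry point (an enum plus device capability in, an import path out), instead of each platform growing its own tangle of string comparisons.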
Test Result
Essential Elements of an Effective PR Description Checklist
`supported_models.md` and `examples` for a new model.