[ROCM] Properly disable Flash Attention/Efficient Attention with environment variables#133866
xinyazhang wants to merge 4 commits into pytorch:main
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/133866
Note: Links to docs will display an error until the docs builds have been completed. ✅ You can merge normally! (3 unrelated failures) As of commit 298fee6 with merge base df68315:
FLAKY - The following job failed but was likely due to flakiness present on trunk.
BROKEN TRUNK - The following jobs failed but were also present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
jithunnair-amd left a comment:
LGTM. @xinyazhang Have you done any local testing with a build that has them disabled?
I ran a local
CI build failure is real.
@jeffdaily It is expected if the AOTriton installation doesn't exist (which I think is the case for the CI image). Both flags should be OFF to disable AOTriton.
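The "both flags must be OFF" requirement can be sketched as a small shell check. This is a hypothetical illustration of the gating logic described in the comment above, not the actual build-system code; the `aotriton_status` variable is invented for the example.

```shell
# Hypothetical sketch: AOTriton is only skipped when BOTH attention
# backends are disabled at build time.
USE_FLASH_ATTENTION=0
USE_MEM_EFF_ATTENTION=0

if [ "${USE_FLASH_ATTENTION}" = "0" ] && [ "${USE_MEM_EFF_ATTENTION}" = "0" ]; then
    aotriton_status="disabled"
else
    aotriton_status="required"
fi
echo "AOTriton ${aotriton_status}"   # prints "AOTriton disabled"
```

Setting either variable back to 1 flips the result to "AOTriton required", matching the behavior the PR is fixing: only disabling both backends removes the AOTriton dependency.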
Okay, I think I found the problem. Within
Otherwise we have circular dependencies.
Force-pushed from e767d4b to 693d4e3
@malfet,
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
The merge job was canceled or timed out. This most often happens if two merge requests were issued for the same PR, or if the merge job was waiting for more than 6 hours for tests to finish. In the latter case, please do not hesitate to reissue the merge command.
@jithunnair-amd it seems the merge was blocked by failing CI, which is supposed to be fixed by #133884
@pytorchbot merge -f "Build issues resolved. This PR is for build scenarios not relevant to CI. Test failures are related to GQA which is addressed in #133884."
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
…tion with environment variables (#1570) Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` can compile correctly. This is a cherry-picked version of pytorch#133866.
…tion with environment variables (#1571) Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` can compile correctly. This is a cherry-picked version of pytorch#133866. --------- Co-authored-by: Pruthvi Madugundu <pruthvigithub@gmail.com>
…ronment variables (#1542) Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` can compile correctly. This is a cherry-picked version of pytorch#133866. Tested with `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py develop --user` and `python -c 'import torch'`.
…ronment variables (pytorch#133866) Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` can compile correctly. Fixes pytorch#125230. Pull Request resolved: pytorch#133866 Approved by: https://github.com/jithunnair-amd, https://github.com/jeffdaily, https://github.com/malfet
Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` can compile correctly.

Fixes #125230
cc @jeffdaily @sunway513 @jithunnair-amd @pruthvistony @ROCmSupport @dllehr-amd @jataylo @hongxiayang