
Disabling Aiter Installation in default build#255

Merged
amd-sriram merged 2 commits into release/1.7.0 from fix_aiter_llvm_issue_1.7.0
Jul 11, 2025

Conversation

@amd-sriram
Collaborator

Added a flag to switch aiter compilation on/off via --aiter when installing apex.

fixes https://ontrack-internal.amd.com/browse/SWDEV-542835
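The opt-in flag can be implemented as a small argv check in setup.py. This is a hedged sketch of that pattern, not the actual patch; the function name `parse_aiter_flag` is an assumption for illustration:

```python
import sys

def parse_aiter_flag(argv):
    """Return (build_aiter, cleaned_argv).

    aiter is compiled only when --aiter is passed explicitly, so the
    default build skips it. The flag is stripped from argv so that
    setuptools does not see an unknown option.
    """
    build_aiter = "--aiter" in argv
    cleaned = [a for a in argv if a != "--aiter"]
    return build_aiter, cleaned

# Typical use near the top of setup.py (sketch):
# BUILD_AITER, sys.argv = parse_aiter_flag(sys.argv)
```

With this in place, `python setup.py install` builds without aiter, while `python setup.py install --aiter` enables it.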

Tested on Docker image:
registry-sc-harbor.amd.com/framework/compute-rocm-dkms-no-npi-hipclang:16387_ubuntu22.04_py3.10_pytorch_lw_rocm7.0_internal_testing_c3f758e0

| Setup.py condition | Is aiter compiled? | UT condition | Backend used | UT status |
| --- | --- | --- | --- | --- |
| --aiter | yes | (none) | aiter | Fail, expected due to llvm issue |
| --aiter | yes | USE_ROCM_AITER_ROPE_BACKEND=0 | Native apex | pass |
| --aiter | yes | USE_ROCM_AITER_ROPE_BACKEND=1 | aiter | Fail, expected due to llvm issue |
| (default) | no | (none) | Native apex | pass |
| (default) | no | USE_ROCM_AITER_ROPE_BACKEND=0 | Native apex | pass |
| (default) | no | USE_ROCM_AITER_ROPE_BACKEND=1 | Native apex | pass |
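The table above reduces to a simple rule: the aiter backend is used only when aiter was compiled and the env switch does not disable it; otherwise the native apex path runs. A minimal sketch of that runtime selection, assuming a helper named `use_aiter_rope` (the function name is an assumption; only the `USE_ROCM_AITER_ROPE_BACKEND` variable comes from the PR):

```python
import os

def use_aiter_rope(aiter_available: bool) -> bool:
    """Decide whether the RoPE op dispatches to aiter.

    - If aiter was not compiled, always fall back to native apex.
    - If it was compiled, USE_ROCM_AITER_ROPE_BACKEND=0 forces the
      native path; unset or =1 selects aiter (matching the table).
    """
    enabled = os.environ.get("USE_ROCM_AITER_ROPE_BACKEND", "1") == "1"
    return aiter_available and enabled
```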

@amd-sriram amd-sriram self-assigned this Jul 11, 2025
@amd-sriram amd-sriram changed the base branch from master to release/1.7.0 July 11, 2025 14:06
@amd-sriram amd-sriram merged commit 1c50337 into release/1.7.0 Jul 11, 2025
@amd-sriram amd-sriram deleted the fix_aiter_llvm_issue_1.7.0 branch July 11, 2025 14:07
jithunnair-amd pushed a commit to ROCm/pytorch that referenced this pull request Jul 14, 2025
- Fixing the C10_warpsize issue: replacing the macros with at::cuda::warp_size() (ROCm/apex#244)
- [release/1.7.0] Added AITER as a submodule and use in fused_rope.py (ROCm/apex@53f3c64, ROCm/apex#226)
- Replaced warpsize with C10_WARP_SIZE (ROCm/apex@f417097, ROCm/apex#253)
- Disabling Aiter Installation in default build (ROCm/apex@1c50337, ROCm/apex#255)

Fixes https://ontrack-internal.amd.com/browse/SWDEV-496182
