fix: bump torchao lower bound to 0.16.0 for torch 2.10.0+cu128 compat#974

Merged
ChuxiJ merged 2 commits into ace-step:main from HugoO612:fix/torchao-cu128-compatibility
Mar 30, 2026

Conversation

@HugoO612 (Contributor) commented Mar 29, 2026

Summary

Bumps the torchao minimum version from 0.14.1 to 0.16.0 to resolve the C++ extension
skip that occurs with torch 2.10.0+cu128 (the version uv resolves from the cu128 index).

Root cause: torchao 0.15.0 was built against torch 2.9.1. When the cu128 index
resolves torch==2.10.0, torchao skips all C++ extensions at import time.
Reference: pytorch/ao#2919

Closes #98

Scope

  • Files changed: pyproject.toml
  • Out of scope: runtime code, other dependency changes
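The change itself is a one-line constraint edit. A hypothetical before/after rendering of the dependency entry (the exact environment-marker spelling lives in pyproject.toml, not reproduced here):

```toml
# Hypothetical sketch; see pyproject.toml for the real entry.
[project]
dependencies = [
    # before: "torchao>=0.14.1,<0.16.0; platform_machine != 'aarch64'",
    "torchao>=0.16.0,<0.17.0; platform_machine != 'aarch64'",
]
```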

Risk and Compatibility

  • Target: Linux CUDA cu128 users
  • aarch64 path: unchanged (separate torchao line untouched)
  • macOS/ROCm/XPU: unaffected, no torchao on those paths
  • Non-target platforms unchanged ✅

Regression Checks

  • GPU: NVIDIA RTX 4060 Laptop (8GB), WSL2 Ubuntu, Python 3.11
  • torch==2.10.0+cu128, torchao==0.16.0+cu128
  • uv run acestep launches successfully with no C++ extension skip warning
  • Gradio UI accessible at http://127.0.0.1:7860

Reviewer Notes

  • torchao 0.16.0 officially targets torch 2.10.0 per pytorch/ao compatibility table
  • Upper bound <0.17.0 retained to avoid unreviewed breaking changes

Summary by CodeRabbit

  • Chores
    • Updated dependency version requirements to improve compatibility.

@coderabbitai Bot commented Mar 29, 2026

No actionable comments were generated in the recent review. 🎉


📥 Commits

Reviewing files that changed from the base of the PR and between f1f8f6c and b6f318c.

📒 Files selected for processing (1)
  • pyproject.toml

📝 Walkthrough

Updated the torchao dependency version constraint for non-aarch64 platforms from >=0.14.1,<0.16.0 to >=0.16.0,<0.17.0 in the project configuration. The aarch64-specific dependency entry remains unchanged.

Changes

  • pyproject.toml (Dependency Update): Bumped torchao version range for non-aarch64 platforms to >=0.16.0,<0.17.0 to address torch compatibility issues.

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~2 minutes



@ChuxiJ (Contributor) left a comment

Code Review — PR #974

LGTM. This is a clean, minimal dependency bump.

What it does:

  • Bumps torchao lower bound from 0.14.1 to 0.16.0 for non-aarch64 platforms
  • Retains <0.17.0 upper bound to avoid unreviewed breaking changes
  • Only touches pyproject.toml — no runtime code changes

Why it's correct:

  • torchao 0.15.0 was built against torch 2.9.1; when the cu128 index resolves torch==2.10.0, torchao skips all C++ extensions at import time (ref: pytorch/ao#2919)
  • torchao 0.16.0 officially targets torch 2.10.0 per the pytorch/ao compatibility table
  • aarch64 path is unchanged (separate unconstrained torchao line)
  • macOS/ROCm/XPU unaffected

Verification:

  • Author tested on RTX 4060 Laptop (8GB), WSL2 Ubuntu, Python 3.11 with torch==2.10.0+cu128, torchao==0.16.0+cu128 — no cpp extension warning, Gradio UI accessible

Minor note:

  • The Windows pyproject.toml line still pins torch==2.7.1+cu128. The torchao entry is gated on platform_machine != 'aarch64' but not on sys_platform, so the new constraint does apply to Windows installs. This is fine: torch 2.7.1 was already compatible with the previous torchao range, and >=0.16.0 should still resolve correctly for Windows.

Approved — ready to merge.

@HugoO612 (Contributor, Author) commented

@ChuxiJ, CI is pending approval to run. Could you approve and run it so we can get this merged? Thanks!

@ChuxiJ ChuxiJ merged commit fd92d96 into ace-step:main Mar 30, 2026
2 checks passed
@HugoO612 HugoO612 deleted the fix/torchao-cu128-compatibility branch March 30, 2026 08:56
ChuxiJ added a commit that referenced this pull request Mar 31, 2026
PR #974 bumped torchao from <0.16.0 to >=0.16.0, but diffusers was
left unpinned. torchao 0.16.0 removed the
`torchao.dtypes.uintx.uint4_layout` module, which diffusers 0.36.0
tries to import at module level. This causes model initialization to
fail with `NameError: name 'logger' is not defined`.

diffusers 0.37.0+ handles this correctly by gating the import behind
a torchao version check (huggingface/diffusers#11018).

Fixes #982

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Successfully merging this pull request may close these issues.

Pytorch incompatible with Torchao
