
Fix incorrect normalization axis in v2.ElasticTransform (Fixes #9299)#9300

Merged
NicolasHug merged 9 commits into pytorch:main from ericjaebeom:fix/elastic-transform-bug
Jan 23, 2026

Conversation

@ericjaebeom (Contributor) commented Dec 3, 2025

This PR fixes a bug where horizontal/vertical displacements were normalized by the wrong spatial dimension (height and width were swapped). Validated with the reproduction script in Issue #9299. Fixes #9299
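For context, here is a minimal sketch of the normalization in question. This is hypothetical simplified code, not the actual torchvision implementation: the point is that the horizontal displacement `dx` must be normalized by the image width and the vertical displacement `dy` by the height, and swapping the two divisors only shows up on non-square images.

```python
import torch

def make_displacement(height, width, alpha=(50.0, 50.0)):
    # Hypothetical simplified sketch of ElasticTransform's displacement
    # generation (not the actual torchvision code).
    rand_dx = torch.rand(1, 1, height, width) * 2 - 1
    rand_dy = torch.rand(1, 1, height, width) * 2 - 1
    # Correct normalization: horizontal shifts scale with width,
    # vertical shifts scale with height.
    dx = rand_dx * alpha[0] / width
    dy = rand_dy * alpha[1] / height
    # The buggy variant divided dx by height and dy by width instead,
    # distorting the displacement field on non-square images.
    return dx, dy

dx, dy = make_displacement(height=4, width=8)
```

On a square image the two variants coincide, which is why the bug survived as long as it did.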

cc @vfdev-5

@pytorch-bot bot commented Dec 3, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/vision/9300

Note: Links to docs will display an error until the docs builds have been completed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla bot commented Dec 3, 2025

Hi @ericjaebeom!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (eg your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at cla@meta.com. Thanks!

@meta-cla bot commented Dec 3, 2025

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!

@meta-cla meta-cla bot added the cla signed label Dec 3, 2025
@zy1git (Contributor) commented Jan 22, 2026

@ericjaebeom Thanks a lot for the issue and PR! I ran some experiments confirming that this PR is correct, and found that v1 has the same issue. Could you please also fix v1 in this PR?

By the way, could you please write a non-regression test for v2? You can add it to the `TestElastic` class in `test_transforms_v2.py`. Feel free to let us know if you have any questions.

@ericjaebeom (Contributor, Author) commented Jan 22, 2026

Changes

v1 ElasticTransform: Applied the same axis fix.

v2 ElasticTransform: Unpacked height, width from query_size() instead of indexing size[0], size[1], preventing potential axis misinterpretation. This is consistent with all other uses of query_size in the same file.
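The second change is less about behavior than about readability. A hypothetical sketch (not the actual query_size code) of why named unpacking helps:

```python
# Hypothetical sketch of the named-unpacking pattern described above.
size = (480, 640)          # (height, width), as a query_size-style helper returns

# Error-prone: positional indexing invites (H, W) vs (x, y) mix-ups.
norm_x_risky = 1.0 / size[1]

# Clearer: named unpacking documents which axis each divisor belongs to.
height, width = size
norm_x = 1.0 / width       # horizontal displacements scale with width
norm_y = 1.0 / height      # vertical displacements scale with height

assert norm_x == norm_x_risky
```

With named variables, a reviewer can check `alpha[0] / width` against the intent at a glance, instead of tracking which index of `size` holds which dimension.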

On regression testing

I explored adding a non-regression test as requested, but found it difficult to design a meaningful deterministic test for this behavior.

The test would need to verify that ElasticTransform.make_params generates the displacement field with correct width/height normalization. However, since the displacement is derived from random noise, any test would either:

  1. Duplicate the implementation logic: regenerating the expected values with the same seed and formula, which doesn't truly test semantic correctness:

```python
# Mirrors the implementation exactly
torch.manual_seed(0)
rand_dx = torch.rand(1, 1, height, width) * 2 - 1
rand_dy = torch.rand(1, 1, height, width) * 2 - 1
expected_dx = rand_dx * alpha[0] / width
expected_dy = rand_dy * alpha[1] / height
```
  2. Rely on statistical properties: e.g. checking that displacement magnitudes follow expected ratios, which requires arbitrary thresholds (such tests would be virtually certain to pass, just less principled):

```python
# Hypothesis test
ratio = torch.std(dy) / torch.std(dx)
assert ratio > 5  # Threshold based on the width/height ratio

# Alternatively, validate the bounds of dy and dx
```

The root cause of this bug was conflating (H, W) tensor dimensions with (x, y) coordinates. Using explicitly named height, width variables (rather than size[0], size[1]) should help prevent this class of error going forward.
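To illustrate the (H, W) vs (x, y) pitfall described above, here is a generic PyTorch sketch (not code from this PR): image tensors index as `[..., row, col]`, i.e. (y, x), while `grid_sample` expects sampling coordinates in (x, y) order in the last dimension, so the two orderings must be swapped exactly once.

```python
import torch
import torch.nn.functional as F

# Image tensor: indexed (N, C, H, W), i.e. rows (y) before columns (x)
img = torch.arange(12.0).reshape(1, 1, 3, 4)

# grid_sample wants normalized coords with the LAST dim ordered (x, y)
ys, xs = torch.meshgrid(
    torch.linspace(-1, 1, 3),   # y spans the H axis
    torch.linspace(-1, 1, 4),   # x spans the W axis
    indexing="ij",
)
grid = torch.stack([xs, ys], dim=-1).unsqueeze(0)  # (1, H, W, 2), (x, y) order

# Sampling with the identity grid reproduces the input;
# stacking (ys, xs) instead would transpose the coordinate meaning.
out = F.grid_sample(img, grid, align_corners=True)
```

Any code that builds such a grid from `size[0]`, `size[1]` has to get this swap right, which is exactly where named `height`, `width` variables pay off.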

I'd appreciate your thoughts on whether a regression test is still warranted given these trade-offs.

@zy1git (Contributor) commented Jan 23, 2026

Thanks a lot for investigating the non-regression test options. In this case, we think a non-regression test is not needed.

Thank you very much for your contributions!

@ericjaebeom (Contributor, Author)

Thanks for the review and the guidance!

@NicolasHug (Member) left a comment


Thanks a lot for the PR and for the fix @ericjaebeom !
Thanks @zy1git for the review!

@NicolasHug NicolasHug merged commit b83f490 into pytorch:main Jan 23, 2026
36 of 43 checks passed
@github-actions

Hey @NicolasHug!

You merged this PR, but no labels were added.
The list of valid labels is available at https://github.com/pytorch/vision/blob/main/.github/process_commit.py

@ericjaebeom ericjaebeom deleted the fix/elastic-transform-bug branch January 23, 2026 17:02

Development

Successfully merging this pull request may close these issues.

v2.ElasticTransform applies incorrect vertical/horizontal normalization on displacement vectors

3 participants