[aotd] Support mutations in reordering_to_mimic_autograd_engine #155353
IvanKobzarev wants to merge 2 commits into gh/IvanKobzarev/113/base
Conversation
🔗 Helpful links: see artifacts and rendered test results at hud.pytorch.org/pr/155353
Note: links to docs will display an error until the docs builds have completed. ✅ You can merge normally! (1 unrelated failure.) As of commit 8ff6544 with merge base ea5b9ec. UNSTABLE: the following job is marked as unstable, possibly due to flakiness on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
test/functorch/test_aotdispatch.py (outdated)

```python
torch.compile(fn, backend="aot_eager", fullgraph=True)(
    dummy, inplace
).sum().backward()
self.assertEqual(ref, inplace)
```
Can we have the test assert that the inputs are correct at both points:
(1) after running the compiled (and reference) forwards, but before running the backward, and
(2) after running the backward?
That should help ensure the test actually confirms that we are not, e.g., moving the backward mutation into the forward graph.
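The two-checkpoint shape the reviewer is asking for can be sketched in plain eager mode. The `MulAndMark` function below is a hypothetical stand-in for an op whose backward mutates an input buffer (it is not the PR's actual test code, and the real test additionally runs the compiled variant under `torch.compile(backend="aot_eager")`):

```python
import torch


class MulAndMark(torch.autograd.Function):
    """Toy op whose backward mutates a buffer in place (hypothetical,
    modeled on the backward-mutation pattern under test)."""

    @staticmethod
    def forward(ctx, x, buf):
        ctx.save_for_backward(buf)
        return x * 2

    @staticmethod
    def backward(ctx, grad_out):
        (buf,) = ctx.saved_tensors
        buf.add_(1)  # the backward-only mutation
        return grad_out * 2, None


x = torch.randn(4, requires_grad=True)
buf = torch.zeros(4)
out = MulAndMark.apply(x, buf)

# (1) after forward, before backward: the buffer must be untouched,
# i.e. the mutation must NOT have been moved into the forward
assert torch.equal(buf, torch.zeros(4))

out.sum().backward()

# (2) after backward: the mutation must have happened exactly once
assert torch.equal(buf, torch.ones(4))
```

Comparing both snapshots between the eager reference and the compiled run is what rules out the failure mode the reviewer mentions (the mutation silently migrating into the forward graph).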
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Merge failed. Reason: 1 job has failed, the first few of them: trunk / linux-jammy-rocm-py3.10 / test (distributed, 1, 1, linux.rocm.gpu.4). Details for Dev Infra team: raised by workflow job.
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Stack from ghstack (oldest at bottom):

Original issue: #154820
Dedicated sub-issue: #155242

The backward graph is reordered by reordering_to_mimic_autograd_engine in partitioners.py, which records in the backward graph only the compute that starts from the tangents. A mutation of primals (inputs) performed in the backward can therefore be disconnected from that tangent-rooted compute and dropped by the reordering. We handle this copy_ specifically: the framework itself inserts this mutation, and it is the only mutation that exists in the backward graph.
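The failure mode can be illustrated with a toy graph in plain Python. This is a conceptual sketch, not the real partitioners.py code: the node names, the dependency dict, and the `copy_` marker are all assumptions made for illustration.

```python
# Backward graph as node -> list of inputs it reads.
graph = {
    "grad_x": ["tangent"],          # reachable from the tangent
    "grad_w": ["tangent", "x"],     # reachable from the tangent
    "copy_":  ["primal", "delta"],  # input mutation: NOT tangent-reachable
}


def record_from_tangents(graph):
    """Record only compute transitively reachable from tangent inputs,
    mimicking how tangent-rooted reordering walks the backward graph."""
    recorded, frontier = [], {"tangent"}
    changed = True
    while changed:
        changed = False
        for node, ins in graph.items():
            if node not in recorded and any(i in frontier for i in ins):
                recorded.append(node)
                frontier.add(node)
                changed = True
    return recorded


order = record_from_tangents(graph)
# The input mutation is silently dropped by tangent-rooted recording:
assert "copy_" not in order

# The fix, conceptually: append the framework-inserted copy_ mutation
# explicitly, since it is the only mutation in the backward graph.
order_fixed = order + [n for n in graph if n == "copy_" and n not in order]
assert order_fixed[-1] == "copy_"
```

Because the framework guarantees `copy_` is the only mutation in the backward graph, special-casing it (rather than doing a general mutation-aware reachability analysis) is sufficient here.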