Don't decompose functional ops in predispatch functionalization #116383
tugsbayasgalan wants to merge 8 commits into gh/tugsbayasgalan/182/base from
Conversation
[ghstack-poisoned]
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/116383
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (1 unrelated failure)
As of commit 852881b with merge base 36dccc2. FLAKY - The following job failed but was likely due to flakiness present on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
torch.ops.aten.dropout.default,  # type: ignore[has-type]
torch.ops.aten.batch_norm.default,  # type: ignore[has-type]
torch.ops.aten.native_batch_norm.default,  # type: ignore[has-type]
torch.ops.aten._batch_norm_impl_index.default,  # type: ignore[has-type]
Hmm, I think we want every op that is a "maybe-mutating/maybe-aliasing" op to be in this list. That also includes:
aten.reshape
aten.contiguous
aten.cudnn_batch_norm
aten.miopen_batch_norm
...??? (hopefully that's it)
Can you also add a test, similar to the batch norm test that you have (proving that batch_norm still decomposes with pre-dispatch functionalization), but also for:
reshape() -> should decompose into view()
contiguous -> should be a no-op if the input is already contiguous
I think reshape and contiguous are already marked as view ops, so we should be good without adding them to the list. Wrote some test cases to verify it.
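A minimal eager-mode sketch of the aliasing behavior those test cases need to cover (checking storage aliasing directly, not the exported graph):

```python
import torch

# reshape on a contiguous tensor behaves like view: the result
# aliases the input's storage instead of copying it
x = torch.arange(6)
y = x.reshape(2, 3)
assert y.data_ptr() == x.data_ptr()

# contiguous() is a no-op when the input is already contiguous:
# it returns the very same tensor object
z = x.contiguous()
assert z is x
```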
torch.ops.prim.device.default,  # type: ignore[has-type]
]

# These are ops that claim to be functional, but actually are not
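For example, aten.dropout has a functional-looking schema, but in eval mode the kernel simply returns its input, so the "functional" output actually aliases the input (a quick eager-mode sketch):

```python
import torch

x = torch.randn(3)
# dropout(Tensor input, float p, bool train) -> Tensor
# with train=False the op returns its input unchanged, so the output
# shares storage with the input despite the alias-free schema
out = torch.ops.aten.dropout.default(x, 0.5, False)
assert out.data_ptr() == x.data_ptr()
```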
Maybe call these "maybe-mutating / maybe-aliasing" ops
We should probably consider adding a tag for these tbh (probably ok not to in this PR)
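Ops already expose their tags in Python, so a tag-based version could be checked the same way existing tags are (sketch; the `maybe_aliasing` tag name here is made up, it does not exist yet):

```python
import torch

# every OpOverload carries a (possibly empty) list of torch.Tag values
tags = torch.ops.aten.dropout.default.tags
print(tags)

# a hypothetical "maybe_aliasing" tag would let the hardcoded list above
# be replaced by a lookup along the lines of:
#   if torch.Tag.maybe_aliasing in op.tags: ...
```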
# turn off decomp for predispatch right now. Therefore, we do best
# effort by not decomposing ops that are functional in PreDispatch functionalization
# for now.
if self._dispatch_key is not None:
nit: maybe make this `self._dispatch_key == torch._C._DispatchKey.PreDispatch` for clarity (it's kinda annoying that `_dispatch_key` seems like it can be an arbitrary dispatch key here, but we know it's either None or PreDispatch. Maybe we should just make it a bool).
Looks good! Just missing the extra "maybe aliasing" ops + extra tests (giving a preemptive stamp).
We talked about the testing strategy for composite op schemas + pre-dispatch tracing more generally, I think it's fine to not do in this PR (but we should probably do it before turning pre-dispatch on by default in export).
@pytorchbot merge
Merge failed. Reason: This PR needs a release notes label. If it is not user facing, please add the `topic: not user facing` label. To add a label, you can comment to pytorchbot. Raised by workflow job.
@pytorchbot label "topic: not user facing"
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Stack from ghstack (oldest at bottom):