Remove code and logic for old style custom autograd Function #57357
soulitzer wants to merge 5 commits into pytorch:master
Conversation
💊 CI failures summary and remediations: as of commit 0e2845c, 2 failures not recognized by patterns (more details on the Dr. CI page).
Force-pushed from ef4cb8c to 390cc44.
Codecov Report
@@ Coverage Diff @@
## master #57357 +/- ##
==========================================
- Coverage 77.72% 77.65% -0.07%
==========================================
Files 1957 1956 -1
Lines 195142 194759 -383
==========================================
- Hits 151665 151234 -431
- Misses 43477 43525 +48
albanD left a comment
This looks good from a quick read.
Just a question: could we restrict the creation of Function even further?
WOW That is sketchy !!! haha
Why can't we just forbid doing `Id()` altogether?
I'm sure we can create the ctx internally without ever calling `torch.autograd.Function.__init__`.
This might be risky because we never raised a warning for this particular code path. There is probably a lot of code that does `instance = MyFunc(); instance.apply(args)`. I found several cases in test_autograd that do this.
I've updated it to raise a warning when someone does `Id()`, so we can remove it in a future version.
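For concreteness, here is a minimal sketch of the two call patterns under discussion (`Id` is the identity example from the thread; the exact warning behavior reflects the update described above and is otherwise an assumption):

```python
import torch

class Id(torch.autograd.Function):
    # Identity function from the thread above.
    @staticmethod
    def forward(ctx, x):
        return x

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output

x = torch.randn(3, requires_grad=True)

# New style: apply() is a static method, called on the class itself.
y = Id.apply(x)

# Old style: instantiate first, then call apply() on the instance.
# This resolves to the same static apply(), but after this PR the
# Id() call itself emits a deprecation warning.
instance = Id()
y = instance.apply(x)
```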
Force-pushed from 8392f6d to 0ab98da.
@soulitzer has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@soulitzer merged this pull request in 710a83d. |
Remove code and logic for old style custom autograd Function (#57357)

Summary: Fixes pytorch#30696

### Release Notes
Instantiating a custom autograd function is now deprecated. Users should call `.apply()` on the class itself because it is a static method.
--end release notes--

- There are a couple of error messages that we can't entirely remove, because accessing these attributes on an autograd function instance may segfault (due to cdata being nullptr). Also added a TORCH_CHECK for the name attribute, which previously segfaulted.
- Error messages updated to convey that 1) old-style functions have been deprecated and 2) this access pattern was once valid.
- Updates variable -> Tensor in some error messages.

Pull Request resolved: pytorch#57357
Reviewed By: mrshenli
Differential Revision: D28193095
Pulled By: soulitzer
fbshipit-source-id: f021b105e9a3fd4a20d6ee3dfb6a06a8c34b10ca
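As a rough illustration of the first bullet in the summary above (a hedged sketch — `MyFunc` is a hypothetical example, and the exact error type and message are taken from the PR summary, not verified here):

```python
import torch

class MyFunc(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        return x

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output

f = MyFunc()  # deprecated instantiation; the instance has no cdata
try:
    # Per the summary, accessing cdata-backed attributes such as name
    # on a bare instance previously segfaulted (cdata is nullptr); the
    # added TORCH_CHECK turns this into a catchable Python error.
    f.name()
except Exception as e:
    print(type(e).__name__, e)
```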
Fixes #30696
Release Notes
Instantiating a custom autograd function is now deprecated. Users should call `.apply()` on the class itself because it is a static method.
--end release notes--
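A minimal sketch of the recommended new-style pattern (the `Square` function is a hypothetical example, not code from this PR):

```python
import torch
from torch.autograd import gradcheck

class Square(torch.autograd.Function):
    # New-style custom Function: forward/backward are static methods,
    # and state flows through ctx rather than through self.
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return 2 * x * grad_output

# Call apply() on the class, never on an instance.
x = torch.randn(4, dtype=torch.double, requires_grad=True)
y = Square.apply(x)
y.sum().backward()

# gradcheck verifies the analytic backward against finite differences.
assert gradcheck(Square.apply, (x,))
```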