
Remove code and logic for old style custom autograd Function #57357

Closed
soulitzer wants to merge 5 commits into pytorch:master from soulitzer:old-custom-fn

Conversation

@soulitzer
Contributor

@soulitzer soulitzer commented Apr 30, 2021

Fixes #30696

Release Notes

Instantiating a custom autograd function is now deprecated. Users should call `.apply()` on the class itself because it is a static method.

--end release notes--

  • There are a couple of error messages that we can't entirely remove, because accessing these attributes on an autograd function instance may segfault (its cdata is nullptr). Also added a TORCH_CHECK for the name attribute, which previously segfaulted.
  • Updated the error message to convey that 1) old-style functions have been deprecated and 2) this access pattern was once valid.
  • Updated "variable" -> "Tensor" in some error messages.
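To make the release note concrete, here is a minimal sketch of the new-style custom autograd Function that remains supported after this PR. `Square` is a hypothetical example, not code from this PR; the key point is that `forward`/`backward` are static methods and `.apply()` is called on the class, never on an instance:

```python
import torch

class Square(torch.autograd.Function):
    """Hypothetical new-style custom Function: y = x^2."""

    @staticmethod
    def forward(ctx, x):
        # Save the input for use in backward.
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        # d(x^2)/dx = 2x, chained with the incoming gradient.
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x

x = torch.tensor(3.0, requires_grad=True)
y = Square.apply(x)   # call .apply() on the class, not on an instance
y.backward()
print(x.grad)         # tensor(6.)
```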

@facebook-github-bot
Contributor

facebook-github-bot commented Apr 30, 2021

💊 CI failures summary and remediations

As of commit 0e2845c (more details on the Dr. CI page):


  • 3/3 failures possibly* introduced in this PR
    • 1/3 non-scanned failure(s)

2 failures not recognized by patterns:

| Job | Step | Action |
| --- | --- | --- |
| CircleCI pytorch_vulkan_linux_bionic_py3_6_clang9_build | (Optional) Merge target branch | 🔁 rerun |
| CircleCI pytorch_linux_bionic_py3_6_clang9_noarch_build | (Optional) Merge target branch | 🔁 rerun |

This comment was automatically generated by Dr. CI.

@soulitzer soulitzer force-pushed the old-custom-fn branch 4 times, most recently from ef4cb8c to 390cc44 Compare April 30, 2021 20:29
@soulitzer soulitzer marked this pull request as ready for review April 30, 2021 20:30
@soulitzer soulitzer requested a review from albanD as a code owner April 30, 2021 20:30
@codecov

codecov Bot commented Apr 30, 2021

Codecov Report

Merging #57357 (0ab98da) into master (7fe4c1d) will decrease coverage by 0.06%.
The diff coverage is 100.00%.

❗ Current head 0ab98da differs from pull request most recent head 2b09045. Consider uploading reports for the commit 2b09045 to get more accurate results

@@            Coverage Diff             @@
##           master   #57357      +/-   ##
==========================================
- Coverage   77.72%   77.65%   -0.07%     
==========================================
  Files        1957     1956       -1     
  Lines      195142   194759     -383     
==========================================
- Hits       151665   151234     -431     
- Misses      43477    43525      +48     

Collaborator

@albanD albanD left a comment


This looks good from a quick read.
Just one question: could we restrict the creation of Function instances even further?

Comment thread test/test_autograd.py Outdated
Collaborator


WOW That is sketchy !!! haha

Comment thread test/test_autograd.py Outdated
Collaborator


Why can't we just forbid doing Id() altogether?
I'm sure we can create the ctx internally without ever calling torch.autograd.Function.__init__.

Contributor Author


This might be risky because we never raised a warning for this particular code path. There is probably a lot of code that does instance = MyFunc(); instance.apply(args). I found several cases in test_autograd that do this.
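The legacy pattern being discussed can be sketched as follows. This is an illustrative reconstruction, assuming a placeholder `MyFunc` standing in for the cases found in test_autograd; `apply` is a classmethod, so calling it through an instance resolves to the same static machinery, which is why the pattern has kept working. Instantiation emits a DeprecationWarning after this change, so the sketch suppresses warnings around the legacy call:

```python
import warnings
import torch

class MyFunc(torch.autograd.Function):
    """Placeholder custom Function, standing in for code in test_autograd."""

    @staticmethod
    def forward(ctx, x):
        return 2 * x

    @staticmethod
    def backward(ctx, grad_output):
        return 2 * grad_output

x = torch.tensor(3.0, requires_grad=True)

# Supported spelling: .apply() on the class itself.
out = MyFunc.apply(x)

# Legacy spelling discussed above: still tolerated for now, but
# instantiating the Function is deprecated after this PR.
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    legacy_out = MyFunc().apply(x)

assert torch.equal(out, legacy_out)
```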

Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

I've updated it to raise a warning when someone does Id(), so we can remove it in a future version.

@soulitzer soulitzer force-pushed the old-custom-fn branch 2 times, most recently from 8392f6d to 0ab98da Compare May 3, 2021 16:52
@soulitzer soulitzer requested a review from albanD May 3, 2021 16:53
Collaborator

@albanD albanD left a comment


Nice!
Thanks!

@facebook-github-bot
Contributor

@soulitzer has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

Collaborator

@albanD albanD left a comment


Thanks for the update!

@facebook-github-bot
Contributor

@soulitzer has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@soulitzer merged this pull request in 710a83d.

krshrimali pushed a commit to krshrimali/pytorch that referenced this pull request May 19, 2021
…#57357)

Summary:
Fixes pytorch#30696

### Release Notes
Instantiating a custom autograd function is now deprecated. Users should call `.apply()` on the class itself because it is a static method.

--end release notes--
 - There are a couple error messages that we can't entirely remove because accessing these attributes of the autograd function instance may segfault (due to cdata being nullptr). Also added a TORCH_CHECK for the name attribute which previously segfaulted.
 - Error message updated to convey 1) old-style functions have been deprecated 2) this access pattern was once valid
 - Updates variable -> Tensor for some error messages

Pull Request resolved: pytorch#57357

Reviewed By: mrshenli

Differential Revision: D28193095

Pulled By: soulitzer

fbshipit-source-id: f021b105e9a3fd4a20d6ee3dfb6a06a8c34b10ca
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 25, 2026
Development

Successfully merging this pull request may close these issues.

Remove deprecated codepath for old-style autograd.Function

3 participants