cleanup static_cast of AutogradMeta #54103
Conversation
[ghstack-poisoned]
💊 CI failures summary and remediations, as of commit 860639e (more details on the Dr. CI page):
```cpp
  return static_cast<AutogradMeta*>(self.unsafeGetTensorImpl()->autograd_meta());
}

DifferentiableViewMeta* get_view_autograd_meta(const Variable& self) {
```
nit: Would we want to use Tensor here? Or do we keep Variable for consistency?
Yes, we have a `using Variable = at::Tensor;` at the top of the header.
We should remove that and rename all of them, I think.
The goal is to reduce the spread of static casts in the autograd code, as per the comment in #49097 (comment). I wasn't sure how to use a virtual method here, but a simple method in `impl` cleans it up quite nicely. [ghstack-poisoned]
This seems OK, but given that all the call sites are testing for view-ness before calling the cast, it seems like it might just be better to make this function return a nullptr/nullopt when it's not castable, and then force callers to test the return output.
ghstack-source-id: 4cdb26e Pull Request resolved: pytorch#54103
The goal is to reduce the spread of static casts in the autograd code, as per the comment in #49097 (comment). I wasn't sure how to use a virtual method here, but a simple method in `impl` cleans it up quite nicely. Differential Revision: [D27117840](https://our.internmc.facebook.com/intern/diff/D27117840) [ghstack-poisoned]
@ezyang I updated following your recommendation. Does that look better? It does make the code simpler, but it is a bit less clear that we use this to check if a Tensor is a view.
One way to make it clearer is to jiggle the naming; e.g.,
The `.is_view()` method actually only refers to backward mode views. This is not a problem right now in master (and thus I didn't revert the other PR) because nothing creates forward AD views. [ghstack-poisoned]
ghstack-source-id: b26a8be Pull Request resolved: pytorch#54610
Summary: Pull Request resolved: #54610

The `.is_view()` method actually only refers to backward mode views. This is not a problem right now in master (and thus I didn't revert the other PR) because nothing creates forward AD views.

Test Plan: Imported from OSS
Reviewed By: gchanan
Differential Revision: D27396756
Pulled By: albanD
fbshipit-source-id: 64ff11c6f2486c6430714988d1cf6ecf3d80dccb
The goal is to reduce the spread of static casts in the autograd code, as per the comment in #49097 (comment).
I wasn't sure how to use a virtual method here, but a simple method in `impl` cleans it up quite nicely.
Stack from ghstack:
Differential Revision: D27117840