Implement backward for torch.symeig #8586
Conversation
I think the backward should at least fail explicitly when the eigenvectors are not kept. Then the user will know what to do to fix it.
I agree with @t-vi. Let's just fail when the eigenvectors are not kept; one can work around this by computing them in the forward pass. We should have a way to specify grad mode on vs. off in forward. I'll post an issue on that. But for this PR, let's just make it an error.
Can you make this change and add a test that asserts an exception is thrown when `eigenvectors=False`?
I added a test to check that the backward call fails. I wasn't sure if this test needs a CUDA version; if it does, I'll be happy to add it. I'll fix the merge conflicts shortly.
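The error contract being discussed can be illustrated with a small toy stand-in. This is not the real `torch.symeig` API; the class name, method names, and error message below are hypothetical, written in numpy purely to show the pattern: when `eigenvectors=False` the forward returns a zero placeholder for the eigenvector slot, so a backward that depends on the eigenvectors should raise rather than silently produce wrong gradients.

```python
import numpy as np

class ToySymeig:
    """Hypothetical stand-in for a symeig op with a backward pass.

    Mirrors the contract discussed above: if the forward ran with
    eigenvectors=False, backward() raises instead of treating the zero
    placeholder tensor as real eigenvectors.
    """

    def __init__(self, A, eigenvectors=True):
        self.eigenvectors = eigenvectors
        w, V = np.linalg.eigh(A)
        self.w = w
        # With eigenvectors=False the op returns a zero tensor here.
        self.V = V if eigenvectors else np.zeros_like(A)

    def backward(self, gw):
        if not self.eigenvectors:
            raise RuntimeError(
                "symeig backward requires computing eigenvectors in forward pass")
        # d(lambda_i)/dA = u_i u_i^T, so the gradient is V @ diag(gw) @ V.T.
        return self.V @ np.diag(gw) @ self.V.T

# The unit-test pattern: assert that backward raises when the
# eigenvectors were not kept in the forward pass.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
op = ToySymeig(A, eigenvectors=False)
try:
    op.backward(np.ones(2))
    raised = False
except RuntimeError:
    raised = True
```

With `eigenvectors=True` and `gw` all ones, the gradient is `V @ V.T`, the identity, since the eigenvector matrix of a symmetric matrix is orthogonal.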
ping @ssnl |
facebook-github-bot
left a comment
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
ssnl
left a comment
sorry for the late review! looks great except for some minor nits!
```cpp
if (gv.defined()) {
  Tensor F = at::zeros_like(self);
  F.add_(at::unsqueeze(lambda, 0)).sub_(at::unsqueeze(lambda, 1));
  F.as_strided({n}, {n + 1}).fill_(INFINITY);
```
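For readers unfamiliar with the `as_strided` trick in the hunk above: viewing an `n x n` matrix as a length-`n` vector with stride `n + 1` over the flat buffer addresses exactly the main diagonal, so `fill_(INFINITY)` puts infinities on the diagonal of `F` (making a later reciprocal zero there). A rough numpy equivalent (illustrative only, not the PR's code):

```python
import numpy as np

n = 4
lam = np.array([1.0, 2.0, 5.0, 9.0])  # eigenvalues, ascending

# F[i, j] = lam[j] - lam[i], matching
# F.add_(at::unsqueeze(lambda, 0)).sub_(at::unsqueeze(lambda, 1))
F = lam[None, :] - lam[:, None]

# Equivalent of F.as_strided({n}, {n + 1}).fill_(INFINITY): a stride of
# n + 1 elements over the flat buffer walks the main diagonal.
diag_view = np.lib.stride_tricks.as_strided(
    F, shape=(n,), strides=((n + 1) * F.itemsize,))
diag_view[:] = np.inf
```

After this, `1.0 / F` has zeros on the diagonal and `1 / (lam[j] - lam[i])` off it, which is the scaling matrix the eigenvector gradient needs.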
```cpp
Tensor result = at::zeros_like(self);
if (gv.defined()) {
  Tensor F = at::zeros_like(self);
  F.add_(at::unsqueeze(lambda, 0)).sub_(at::unsqueeze(lambda, 1));
```
```cpp
      "computing eigenvectors in forward pass"));
}

Tensor result = at::zeros_like(self);
```
Hey @sethah, have you had time to handle the PR comments?
@ssnl Sorry for the delay, I think my latest commit should cover the feedback you provided. Thanks!
facebook-github-bot
left a comment
@ssnl has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
facebook-github-bot
left a comment
@ssnl is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Summary: Partially fixes pytorch#6890. (The backward pass for non-symmetric eigendecomposition is not implemented in other packages, e.g. autograd, mxnet, tensorflow, presumably because the eigenvalues can be imaginary in the general case, and AFAIK we cannot support complex numbers.)

This patch adds a backward function for the symmetric eigendecomposition function `torch.symeig`. The formula used is taken from [here](http://eprints.maths.ox.ac.uk/1079/1/NA-08-01.pdf). Unit tests are added to verify correctness.

There is still one outstanding issue: how to handle the case where `symeig` is called with `eigenvectors=False`. In this case, the eigenvectors are returned as a zero tensor, but the backward computation for the eigenvalues depends on the eigenvectors. There was a previous attempt to implement this in pytorch#2026, where apaszke mentioned that the `eigenvectors` argument should be overridden so that they are saved for the backward pass. The forward code is autogenerated, though, and it isn't clear to me how that would be done. I'd appreciate any guidance. For now, there is a unit test that will fail until that issue is resolved.

Pull Request resolved: pytorch#8586
Reviewed By: ezyang
Differential Revision: D8872760
Pulled By: SsnL
fbshipit-source-id: 76614495d0f9c118fec163a428f32e5480b4d115
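As a hedged sketch of the math the summary refers to: assuming the formula from the linked note, the backward of `A = V diag(w) V^T` maps upstream gradients `(gw, gv)` to `gA = V (diag(gw) + F * (V^T gv)) V^T`, where `F[i, j] = 1 / (w[j] - w[i])` with zeros on the diagonal, symmetrized since `A` is symmetric. The numpy version below illustrates that formula; it is not the PR's actual C++ implementation, and the symmetrization convention is an assumption.

```python
import numpy as np

def symeig_backward(gw, gv, w, V):
    """Backward of the symmetric eigendecomposition A = V @ diag(w) @ V.T.

    gw: gradient w.r.t. eigenvalues, shape (n,)
    gv: gradient w.r.t. eigenvectors, shape (n, n), columns matching V's
    Returns the gradient w.r.t. A, symmetrized since A is symmetric.
    """
    F = w[None, :] - w[:, None]      # F[i, j] = w[j] - w[i]
    np.fill_diagonal(F, np.inf)      # so the reciprocal is 0 on the diagonal
    F = 1.0 / F
    gA = V @ (np.diag(gw) + F * (V.T @ gv)) @ V.T
    return 0.5 * (gA + gA.T)

# Finite-difference sanity check on f(A) = sum of eigenvalues:
# the analytic gradient is V @ I @ V.T = I, and since the eigenvalue
# sum equals trace(A), the directional derivative along E is trace(E).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)); A = 0.5 * (A + A.T)
E = rng.standard_normal((4, 4)); E = 0.5 * (E + E.T)
w, V = np.linalg.eigh(A)
gA = symeig_backward(np.ones(4), np.zeros((4, 4)), w, V)
eps = 1e-6
fd = (np.linalg.eigvalsh(A + eps * E).sum()
      - np.linalg.eigvalsh(A - eps * E).sum()) / (2 * eps)
```

Setting `gv = 0` isolates the eigenvalue part of the formula; the `F * (V.T @ gv)` term carries the eigenvector gradients and is exactly what blows up when the forward was run with `eigenvectors=False` and `V` is a zero placeholder.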