
Implement backward for torch.symeig #8586

Closed

sethah wants to merge 12 commits into pytorch:master from sethah:symeig_back

Conversation

@sethah (Contributor) commented Jun 17, 2018

Partially fixes #6890. (The backward pass for non-symmetric eigen-decomposition is not implemented in other packages either, e.g. autograd, mxnet, tensorflow, presumably because the eigenvalues can be imaginary in the general case, and AFAIK we cannot support complex numbers.)

This patch adds a backward function for the symmetric eigen-decomposition function `torch.symeig`. The formula used is taken from [here](http://eprints.maths.ox.ac.uk/1079/1/NA-08-01.pdf). Unit tests are added to verify correctness.

There is still one outstanding issue: how to handle the case where `symeig` is called with `eigenvectors=False`. In that case the eigenvectors are returned as a zero tensor, but the backward computation for the eigenvalues depends on the eigenvectors. There was a previous attempt to implement this in #2026, where @apaszke mentioned that the `eigenvectors` argument should be overridden so that the eigenvectors are saved for the backward pass. The forward code is autogenerated, though, and it isn't clear to me how that would be done; I'd appreciate any guidance. For now, there is a unit test that will fail until that issue is resolved.
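For reference, the gradient formula in question can be sketched in NumPy (an illustrative reimplementation, not the PR's actual ATen code; `symeig_backward` and its signature are names invented for this sketch, and it assumes the eigenvalues are distinct):

```python
import numpy as np

def symeig_backward(grad_w, grad_v, w, v):
    """Adjoint of the symmetric eigen-decomposition A = v @ diag(w) @ v.T.

    grad_w, grad_v are incoming gradients w.r.t. the eigenvalues w and
    eigenvectors v; returns the gradient w.r.t. the symmetric input A.
    """
    # F[i, j] = 1 / (w[j] - w[i]) off the diagonal, 0 on the diagonal
    # (matching the INFINITY fill trick in the C++ code below).
    F = w[np.newaxis, :] - w[:, np.newaxis]
    np.fill_diagonal(F, np.inf)  # 1/inf = 0 kills the diagonal terms
    F = 1.0 / F
    inner = np.diag(grad_w)
    if grad_v is not None:
        inner = inner + F * (v.T @ grad_v)
    grad_A = v @ inner @ v.T
    # The input is symmetric, so project the gradient back onto the
    # space of symmetric matrices.
    return 0.5 * (grad_A + grad_A.T)
```

As a sanity check: for f(A) = sum of eigenvalues = trace(A), grad_w is all ones and grad_v is zero, and the formula reduces to v @ v.T, i.e. the identity matrix.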

@t-vi (Collaborator) commented Jun 27, 2018

I think the backward should at least fail explicitly when the eigenvectors are not kept. Then the user will know what to do to fix it.

@ssnl (Collaborator) commented Jun 27, 2018

I agree with @t-vi. Let's just fail when eigenvectors=False for now.

One could work around this by making symeig always compute eigenvectors and exposing a wrapper function to Python, but that is inefficient when users don't want gradients.

We should have a way to specify grad mode on vs. off in forward. I'll post an issue on that. For this PR, let's just make it an error.
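In plain Python, the agreed-upon behavior amounts to a guard like the following (an illustrative sketch only; the real check lives in the autogenerated C++ backward, and the function name and error message here are hypothetical):

```python
def symeig_backward_guarded(grad_w, grad_v, w, v, eigenvectors):
    """Backward for symeig that refuses to run if the forward pass
    did not keep the eigenvectors (eigenvectors=False)."""
    if not eigenvectors:
        raise RuntimeError(
            "symeig: cannot compute backward without "
            "computing eigenvectors in forward pass; "
            "call symeig with eigenvectors=True")
    # ... the actual gradient computation would go here ...
    return None
```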

@ssnl (Collaborator) commented Jun 27, 2018

Can you make this change and add a test asserting that an exception is thrown when backward is called with eigenvectors=False?

@sethah (Contributor, author) commented Jun 28, 2018

I added a test to check that the backward call fails. I wasn't sure whether this test needs a CUDA version; if it does, I'll be happy to add one.

I'll fix merge conflicts shortly.

@sethah (Contributor, author) commented Jul 7, 2018

ping @ssnl

@facebook-github-bot (Contributor) left a comment

@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@ssnl (Collaborator) left a comment

Sorry for the late review! Looks great except for some minor nits.

```cpp
if (gv.defined()) {
  Tensor F = at::zeros_like(self);
  F.add_(at::unsqueeze(lambda, 0)).sub_(at::unsqueeze(lambda, 1));
  F.as_strided({n}, {n + 1}).fill_(INFINITY);
```

This comment was marked as off-topic.

```cpp
Tensor result = at::zeros_like(self);
if (gv.defined()) {
  Tensor F = at::zeros_like(self);
  F.add_(at::unsqueeze(lambda, 0)).sub_(at::unsqueeze(lambda, 1));
```

This comment was marked as off-topic.

```cpp
      "computing eigenvectors in forward pass"));
}

Tensor result = at::zeros_like(self);
```

This comment was marked as off-topic.

@ezyang (Contributor) commented Jul 23, 2018

Hey @sethah, have you had time to handle the PR comments?

@sethah (Contributor, author) commented Jul 23, 2018

@ssnl Sorry for the delay, I think my latest commit should cover the feedback you provided. Thanks!

@facebook-github-bot (Contributor) left a comment

@ssnl has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@ssnl (Collaborator) left a comment

Thanks!!

@facebook-github-bot (Contributor) left a comment

@ssnl has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) left a comment

@ssnl is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

jramseyer pushed a commit to jramseyer/pytorch that referenced this pull request Jul 30, 2018
Pull Request resolved: pytorch#8586

Reviewed By: ezyang

Differential Revision: D8872760

Pulled By: SsnL

fbshipit-source-id: 76614495d0f9c118fec163a428f32e5480b4d115
@zou3519 zou3519 mentioned this pull request Jul 31, 2018
goodlux pushed a commit to goodlux/pytorch that referenced this pull request Aug 15, 2018
Development

Successfully merging this pull request may close these issues.

[feature request] the derivative for 'eig' is not implemented
