torch.lobpcg always breaks for autograd #38948
Closed
Labels: module: autograd, module: linear algebra, triaged
🐛 Bug
It seems that torch.lobpcg (https://pytorch.org/docs/stable/torch.html?highlight=lobpcg#torch.lobpcg) always breaks when trying to take gradients via backward.

To Reproduce
Here's a minimal example showing lobpcg breaking (sketched below). Running that code produces an error as soon as backward() is called.
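A minimal sketch of this kind of repro, assuming a small symmetric positive-definite input (the exact snippet from the report is not reproduced here):

```python
import torch

# Assumed setup: a small symmetric positive-definite matrix that
# requires grad, so there is something for backward() to reach.
A = torch.randn(10, 10)
A = A @ A.t() + 10.0 * torch.eye(10)
A.requires_grad_(True)

# Ask LOBPCG for the two largest eigenpairs.
eigenvalues, eigenvectors = torch.lobpcg(A, k=2)

# Differentiating through the eigenvalues is where it breaks.
eigenvalues.sum().backward()
```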
I have a feeling that the problem is that torch.lobpcg's implementation is using an in-place operation when it shouldn't be; a generic illustration of that failure mode follows.
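For reference, here is how an unsanctioned in-place op trips up autograd (a generic illustration, not lobpcg's actual code):

```python
import torch

x = torch.ones(3, requires_grad=True)
y = torch.exp(x)    # exp saves its output for use in backward
y.add_(1)           # in-place op clobbers that saved output
y.sum().backward()  # RuntimeError: one of the variables needed for
                    # gradient computation has been modified by an
                    # inplace operation
```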
This happened when running torch.__version__ == '1.5.0+cpu', installed with pip, on Windows 10 WSL (Windows Subsystem for Linux) with Python 3.5.2.

Can this be fixed, or is torch.lobpcg not meant to support autograd?

cc @ezyang @albanD @zou3519 @gqchen @pearu @nikitaved @vincentqb @vishwakftw @jianyuh @mruberry @ssnl