PowBackward always operates in Complex dtype #46936
Closed
Labels
module: complex (Related to complex number support in PyTorch)
module: performance (Issues related to performance, either of kernel code or framework glue)
module: regression (It used to work, and now it doesn't)
triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
🐛 Bug
`auto` in the following line is always deduced to be `c10::complex`:

pytorch/torch/csrc/autograd/FunctionsManual.cpp (line 188 in bbe5bfa)
This leads to PowBackward always operating on complex tensors due to type promotion:

pytorch/torch/csrc/autograd/FunctionsManual.cpp (line 192 in bbe5bfa)

which likely causes a performance regression for real-valued inputs.
cc @ezyang @anjali411 @dylanbespalko @mruberry @VitalyFedyunin @ngimel