stft: Change require_complex warning to an error #49022
peterbell10 wants to merge 6 commits into gh/peterbell10/34/base
Conversation
[ghstack-poisoned]
💊 CI failures summary and remediations

As of commit a643614 (more details on the Dr. CI page):

🕵️ 1 new failure recognized by patterns. The following CI failures do not appear to be due to upstream breakages.
    }
    TORCH_CHECK(
        return_complexOpt.has_value() || return_complex,
        "stft requires the return_complex parameter be explicitly "
This is good. Do you think we should also take this opportunity to deprecate return_complex=False?
    Setting :attr:`return_complex` explicitly will be required in a future
    PyTorch release. Set it to False to preserve the current behavior or
    True to return a complex output.

    From version 1.8.0, :attr:`return_complex` must be given explicitly for
This warning seems great. Same question about whether we should deprecate return_complex=False now, too?
Seems like we should? I don't think we intend to support float tensors as surrogates for complex tensors long term.
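For context on what dropping `return_complex=False` would mean for callers: the real-valued layout is just the complex spectrum with a trailing dimension of size 2 (the layout `torch.view_as_real` produces), so no information is lost by switching to complex output. A minimal NumPy stand-in (not the torch API) illustrating that the two layouts are interconvertible:

```python
import numpy as np

# Complex one-sided spectrum of a short real signal (a stand-in for one
# stft column); np.fft.rfft is used here purely for illustration.
spec = np.fft.rfft(np.array([1.0, 0.0, -1.0, 0.0]))

# The "real view": real and imaginary parts stacked on a new last axis,
# i.e. the layout stft returned when return_complex=False.
as_real = np.stack([spec.real, spec.imag], axis=-1)

assert as_real.shape == spec.shape + (2,)
# Round-tripping back to complex loses nothing.
assert np.allclose(as_real[..., 0] + 1j * as_real[..., 1], spec)
```

This is the same shape relationship users rely on when migrating old `return_complex=False` call sites to complex output plus an explicit real view.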
ghstack-source-id: 0c3cabe
Pull Request resolved: pytorch#49022
@mruberry I've rebased on […]. Test failures look unrelated.
    "(which will be required in a future pytorch release). "
    );

    TORCH_WARN_ONCE(
This warning needs to be gated on the value actually being false, though. We don't want people who set return_complex=True to see it.
Oh wait... I understand how this will work now. OK. That's cool.
So the conditional only triggers if return_complex is unspecified or False and it needs to be specified. Then the check will capture the case where it's unspecified. If the check is hit this function will stop (since it threw an exception). If not, then the value is specified but it's False, so the warning is thrown.
Cool.
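The flow described above can be sketched in plain Python (names here are illustrative; the real implementation is the C++ `TORCH_CHECK`/`TORCH_WARN_ONCE` shown in the diff):

```python
import warnings

def check_stft_return_complex(return_complex):
    """Sketch of the gating described above; None means "unspecified".

    - unspecified      -> hard error (the TORCH_CHECK path)
    - explicitly False -> deprecation warning (the TORCH_WARN_ONCE path)
    - explicitly True  -> no message at all
    """
    if return_complex is None:
        # Unspecified: stop here by raising, mirroring the TORCH_CHECK.
        raise ValueError(
            "stft requires the return_complex parameter be explicitly "
            "specified (which will be required in a future pytorch release).")
    if not return_complex:
        # Specified but False: deprecation warning only.
        warnings.warn(
            "return_complex=False is deprecated; prefer complex output.",
            UserWarning)
```

So `return_complex=True` passes through silently, exactly as the reviewer wanted.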
Hey @peterbell10, my mistake on landing this a little too swiftly. It looks like torchaudio has yet to update their stft calls. Let me ping them to do that before we land this so we don't break people using them.

Torchaudio issue: pytorch/audio#1095
Summary: Pull Request resolved: pytorch#49022
Test Plan: Imported from OSS
Reviewed By: ngimel
Differential Revision: D25569586
Pulled By: mruberry
fbshipit-source-id: 09608088f540c2c3fc70465f6a23f2aec5f24f85
Summary: Pull Request resolved: pytorch#49022

**BC-breaking note**: Previously torch.stft took an optional `return_complex` parameter that indicated whether the output would be a floating point tensor or a complex tensor. By default `return_complex` was False to be consistent with the previous behavior of torch.stft. This PR changes this behavior so `return_complex` is a required argument.

**PR Summary**:
* **pytorch#49022 stft: Change require_complex warning to an error**

Test Plan: Imported from OSS
Reviewed By: ngimel
Differential Revision: D25658906
Pulled By: mruberry
fbshipit-source-id: 11932d1102e93f8c7bd3d2d0b2a607fd5036ec5e
BC-breaking note:

Previously torch.stft took an optional `return_complex` parameter that indicated whether the output would be a floating point tensor or a complex tensor. By default `return_complex` was False to be consistent with the previous behavior of torch.stft. This PR changes this behavior so `return_complex` is a required argument.

PR Summary:
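The BC break boils down to a call-site change. Below is a hedged before/after sketch using a plain-Python stand-in for the new required-argument behavior (this is not the real torch.stft signature, which takes many more parameters, and the real check raises a RuntimeError from C++ rather than a TypeError):

```python
# Plain-Python stand-in: return_complex is now effectively required, so
# omitting it fails instead of silently defaulting to the real layout.
def stft(signal, n_fft, *, return_complex):
    # Returns a descriptive tag instead of a tensor; behavioral sketch only.
    return "complex tensor" if return_complex else "real tensor (..., 2)"

# After this PR, callers must choose explicitly:
assert stft([0.0] * 8, 4, return_complex=True) == "complex tensor"
assert stft([0.0] * 8, 4, return_complex=False) == "real tensor (..., 2)"

# The old implicit default-False call now errors:
try:
    stft([0.0] * 8, 4)
except TypeError:
    pass
else:
    raise AssertionError("expected an error when return_complex is omitted")
```

Callers who want the old real-valued layout can pass `return_complex=True` and apply `torch.view_as_real` to the result, or keep passing `return_complex=False` explicitly while it remains supported.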
Stack from ghstack:
Differential Revision: D25658906