Added type promotion logic for complex numbers #34093
anjali411 wants to merge 10 commits into pytorch:master
Conversation
💊 CircleCI build failures summary and remediations
As of commit 9f86e56 (more details on the Dr. CI page):
🕵️ 1 new failure recognized by patterns
The following build failures do not appear to be due to upstream breakages (reran 1 job to discount flakiness).
Force-pushed: fbea963 → 2317698
Force-pushed: 6835e9d → 960af5c
I'm going to let @nairbv take this one in its entirety
question: is it a conscious decision that a complex type is not also considered a floating type? I could see considering complex to be a special case of floating types.
I also think complex should be considered a floating point type
will make a follow-up PR for this
Let's update this to `else if` anyway, though, so it's future-proof for that change.
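The distinction being debated above can be sketched in plain Python. This is only an illustration of the dtype-category ladder PyTorch's promotion follows (bool < integral < floating < complex), not the actual C++ implementation; the names here are made up for the sketch.

```python
# Illustrative dtype-category ladder used by type promotion:
# bool < integral < floating < complex. Whether "complex" is treated
# as its own category or as a special case of floating is exactly the
# question raised in this review thread.
CATEGORY = {
    "bool": 0,
    "uint8": 1, "int8": 1, "int16": 1, "int32": 1, "int64": 1,
    "float16": 2, "float32": 2, "float64": 2,
    "complex64": 3, "complex128": 3,
}

def promote_category(a, b):
    """Return the winning category when two dtypes are combined."""
    return max(CATEGORY[a], CATEGORY[b])
```

Under this ladder a complex operand always wins over a real one, which is why an `else if` chain over the categories stays correct even if complex is later folded into the floating-point check.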
Force-pushed: 960af5c → 12ddcfd
Force-pushed: c5e6ec3 → 51389dc
test/test_type_promotion.py
Outdated
Using randn because we don't have rand for complex dtypes right now (#34380).
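For context, a complex normal sample can be built from two independent real normal draws; the sketch below shows the idea with the stdlib `random` module (a plausible construction, not PyTorch's code — libraries also differ on whether each part gets variance 1 or 1/2).

```python
import random

def complex_randn():
    # Draw the real and imaginary parts independently from N(0, 1).
    # (Some libraries use N(0, 1/2) per part so the total variance is 1.)
    return complex(random.gauss(0.0, 1.0), random.gauss(0.0, 1.0))
```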
Force-pushed: b2ea48f → c550bfe
Change `isFloatingPoint || isComplex` to `isFloatingPoint` after #34392.
facebook-github-bot left a comment
@anjali411 has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
cc @ailzhang for the XLA test failure
test/test_type_promotion.py
Outdated
@mruberry Can you suggest how to skip complex dtypes entirely for XLA? ;) Like the half types?
Try adding the unsupported types here: https://github.com/pytorch/xla/blob/c77641f460133b0f1c75ac8c3533b987278b5c25/torch_patches/X10-device_test.diff#L46
Hmm, that only works when the dtype appears in the decorator and is skipped during test instantiation. But here we explicitly call with complex dtypes, so it doesn't work (please correct me if I'm wrong).
@anjali411 would you mind refactoring these tests to keep float and complex in separate tests? They can call into common helper functions. Since type promotion is pretty major behavior, I do want to keep XLA running the parts it's able to run. Thanks a lot!
test/test_type_promotion.py
Outdated
And this test too. Thanks!!
docs/source/tensor_attributes.rst
Outdated
I thought we weren't going to add e.g. torch.*ComplexFloatTensor? IMO we should change the "Tensor types" header to "Legacy constructor"
I made these changes before our discussion; I'll remove them and update the header title.
Unlanding. This is causing failures on master; maybe a missing import?
Force-pushed: 91c6a50 → 3d99fc8
Yeah, my bad: I needed to call torch.testing.get_all_complex_dtypes() instead of get_all_complex_dtypes(). Thanks!!
Force-pushed: 4c01324 → 72d7df4
Force-pushed: 38fe982 → d19a6ff
…ated test_many_promotions, added alias for complex types and other minor changes
Force-pushed: d19a6ff → 9f86e56
@anjali411 merged this pull request in c73e970.
Issue: #33780
After this PR: