Move scalar_to_tensor_default_dtype out of ScalarOps.h because it's only useful for torch.where. #50111
gchanan wants to merge 7 commits into gh/gchanan/349/base
Conversation
💊 CI failures summary and remediations
As of commit 033a616 (more details on the Dr. CI page):
1 job timed out:
🚧 1 fixed upstream failure: These were probably caused by upstream breakages that were already fixed.
Please rebase on the
"Move scalar_to_tensor_default_dtype out of ScalarOps.h because it's only useful for torch.where." Differential Revision: [D25789638](https://our.internmc.facebook.com/intern/diff/D25789638)
```cpp
namespace {

static Tensor wrapped_scalar_tensor(
...
at::Tensor scalar_to_tensor_default_dtype(
```
This function must still deserve some kind of comment, right? Even if it's "DO NOT USE THIS".
oh, I didn't do that because it's just an implementation detail of wrapped_scalar_tensor below -- I'll make a note of that.
```cpp
    return at::scalar_tensor(
        s, at::device(device).dtype(at::get_default_complex_dtype()));
  } else {
    AT_ASSERT(s.isIntegral(false));
```
AT_ASSERT -> TORCH_INTERNAL_ASSERT
```cpp
  }
}

// `use_default_dtype` is a bit of a hack because torch.where doesn't support type promotion, but
```
Wow this comment really dives in!
Is this function only intended to be used by torch.where? A sentence upfront about its intended use would be helpful. In particular, if torch.where supported type promotion properly would you expect us to get rid of this function, and should this comment tell developers not to use this function?
ya, I can add a sentence upfront summarizing.
Move scalar_to_tensor_default_dtype out of ScalarOps.h because it's only useful for torch.where. (pytorch#50111)
Summary: Pull Request resolved: pytorch#50111
Test Plan: Imported from OSS
Reviewed By: mruberry
Differential Revision: D25789638
Pulled By: gchanan
fbshipit-source-id: 4254e11e08606b64e393433ef2c169889ff2ac07
Stack from ghstack:
Differential Revision: D25789638