Make mutation test work with quantized tensors #108935
ezyang wants to merge 5 commits into gh/ezyang/2334/base
Conversation
You can't use torch.equal because NaN doesn't compare equal to itself, but if you reinterpret the tensors as int8 tensors and compare those, it works. As an added bonus, this approach also works with quantized tensors. Signed-off-by: Edward Z. Yang <ezyang@meta.com> [ghstack-poisoned]
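The bitwise-comparison trick described above can be sketched roughly as follows. This is a minimal, hypothetical sketch, not the PR's exact code: the `bitwise_equal` name comes from the diff fragment quoted in the review below, but the `int_repr()` handling for the quantized branch is my assumption about how the quantized case could be covered.

```python
import torch

def bitwise_equal(lhs, rhs):
    # Compare raw storage bytes so that NaN compares equal to an
    # identical NaN bit pattern (plain torch.equal would return False).
    if lhs.is_quantized:
        # Quantized tensors can't be viewed as int8 directly;
        # compare their integer representations instead (assumption).
        lhs = lhs.int_repr()
        rhs = rhs.int_repr()
    else:
        # Reinterpret the underlying bytes as int8 (requires
        # contiguous tensors of the same dtype and shape).
        lhs = lhs.contiguous().view(torch.int8)
        rhs = rhs.contiguous().view(torch.int8)
    return torch.equal(lhs, rhs)
```

For example, a tensor containing NaN fails `torch.equal` against a clone of itself, but passes `bitwise_equal`, since the clone carries the same bit pattern.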
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/108935
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (3 Unrelated Failures) As of commit 9947b31 with merge base 2b138e4.
BROKEN TRUNK - The following job failed but was present on the merge base: 👉 Rebase onto the `viable/strict` branch to avoid these failures.
UNSTABLE - The following jobs failed but were likely due to flakiness present on trunk and have been marked as unstable.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@pytorchbot merge -f "regular flow is sus"
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
    def __torch_dispatch__(self, func, types, args=(), kwargs=None):
        def bitwise_equal(lhs, rhs):
            if lhs.is_quantized:
Do we actually have this use case? Is this for tracing a model quantized with eager-mode quantization?
I was working on applying this cross-ref test to more operators, and the quantized ones started failing, so I fixed it with this. This is testing code.
Stack from ghstack (oldest at bottom):
Signed-off-by: Edward Z. Yang ezyang@meta.com