[quant][graphmode] Quantizing traced modules #39826
jerryzh168 wants to merge 13 commits into gh/jerryzh168/340/base
Conversation
Summary: Expanding operator test coverage to traced modules Test Plan: Reviewers: Subscribers: Tasks: Tags: [ghstack-poisoned]
💊 CI failures summary and remediations (as of commit 7fda2e9; more details on the Dr. CI page):

🚧 1 fixed upstream failure: these were probably caused by upstream breakages that were already fixed. Please rebase on the
Summary: Expanding operator test coverage to traced modules Test Plan: Reviewers: Subscribers: Tasks: Tags: Differential Revision: [D21991266](https://our.internmc.facebook.com/intern/diff/D21991266) [ghstack-poisoned]
vkuzo left a comment:

Accepting to unblock. It would be great if we added some inline comments on the why behind these changes, i.e. what specifically is different between the graphs generated by scripting vs. tracing.
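A hypothetical illustration of the difference the reviewer is asking about (not code from this PR): scripting preserves Python control flow as `prim::If` nodes in the TorchScript graph, while tracing records only the path taken for the example input, so the two frontends can produce structurally different graphs for the same module.

```python
import torch
import torch.nn as nn

class MaybeRelu(nn.Module):
    def forward(self, x):
        # Data-dependent branch: scripting keeps it, tracing bakes in one path.
        if x.sum() > 0:
            return torch.relu(x)
        return x

scripted = torch.jit.script(MaybeRelu())
traced = torch.jit.trace(MaybeRelu(), torch.ones(2, 2))

# The scripted graph contains the branch; the traced graph does not.
assert "prim::If" in str(scripted.graph)
assert "prim::If" not in str(traced.graph)
```

This is one reason a quantization pass that pattern-matches on graph IR needs separate test coverage for scripted and traced modules.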
        fn(*args, **kwargs)
    return wrapper

def get_script_module(model, tracing, data):
This is used in tests; you probably missed the changes in test_quantize_script.py. It's too large, so the change is not displayed by default in GitHub.
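A minimal sketch of what a helper like `get_script_module` could look like (the actual helper lives in the test utilities of this PR; this body is an assumption): it either scripts the model or traces it with the example data, so the same operator tests can exercise both frontends.

```python
import torch
import torch.nn as nn

def get_script_module(model, tracing, data):
    # Sketch: trace with the example inputs when tracing=True,
    # otherwise compile the model with the TorchScript compiler.
    if tracing:
        return torch.jit.trace(model, data)
    return torch.jit.script(model)

m = nn.Linear(4, 4)
x = torch.randn(2, 4)
for tracing in (False, True):
    sm = get_script_module(m, tracing, x)
    assert isinstance(sm, torch.jit.ScriptModule)
    assert sm(x).shape == (2, 4)
```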
    }

    template <bool ReLUFused = false>
    Tensor qadd_scalar_tensor(Tensor qa, Tensor b) {
Comment: wondering if there is context for this... b is a tensor which represents a scalar? Is this related to how tracing works? Maybe we can add a comment?

Reply: yeah, will add a comment. Tracing will trace a Scalar as a tensor.

Comment: wondering if the comment is accurate: add and add_scalar cannot be merged, since add_scalar does not require a scale and zero point.

Reply: it's a little bit confusing; I meant that when we add broadcasting support, we can merge add_scalar with that op. A separate task is to merge all variations of quantized::add into a single quantized::add, since we have overloading.
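A hypothetical illustration of the point above: when a "scalar" operand is produced from a tensor, it stays a 0-dim Tensor in the traced graph, so the add is recorded as add(Tensor, Tensor) rather than add(Tensor, Scalar). This is why the quantizer needs an overload like `qadd_scalar_tensor` that accepts a tensor-typed scalar.

```python
import torch

def add_scalar_like(x, y):
    # y.mean() is conceptually a scalar, but it is a 0-dim Tensor,
    # so tracing records a Tensor-Tensor add, not a Tensor-Scalar add.
    return x + y.mean()

x = torch.randn(2, 2)
y = torch.randn(2, 2)
traced = torch.jit.trace(add_scalar_like, (x, y))

g = str(traced.graph)
assert "aten::mean" in g and "aten::add" in g
assert torch.allclose(traced(x, y), x + y.mean())
```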
      const PatternInfo mul_nn_relu = PatternInfo::parse_from_str(R"(
    -   graph(%self, %a, %b):
    +   graph(%self, %a, %b, %relu):
Comment: more for my own understanding, this is not a logic change and just moving %relu out, right?

Reply: this is a logic change. We are improving the support here to be able to match relu modules used multiple times, and the matching in is_relu_module is also more accurate than match::module["ReLU"].
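A hypothetical example of the case mentioned above (assumed module and names, not from this PR): a single `nn.ReLU` submodule can be invoked more than once in `forward`, so a fusion pattern has to be able to match a relu module that appears at several call sites, not just a fresh ReLU per use.

```python
import torch
import torch.nn as nn

class TwoMulRelu(nn.Module):
    def __init__(self):
        super().__init__()
        self.relu = nn.ReLU()  # one instance, used at two call sites below

    def forward(self, a, b):
        return self.relu(a * b) + self.relu(b * a)

m = torch.jit.script(TwoMulRelu())
a, b = torch.randn(2, 2), torch.randn(2, 2)
out = m(a, b)
assert out.shape == (2, 2)
assert torch.allclose(out, torch.relu(a * b) + torch.relu(b * a))
```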
      def __init__(self, dim, inplace):
          super(ConvNdRelu, self).__init__()
    -     self.conv = conv_module[dim](1, 4, 2, 3).float()
    +     self.conv = conv_module[dim](3, 3, 3).float()
Comment: why is the dimension changed here?

Reply: you mean the argument? They were changed because I changed the input dimensions.
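A sketch reconstructing the fixture around this diff (the dict name `conv_module` comes from the diff; the rest is an assumption): `conv_module` maps a dimension to the matching Conv class, so `conv_module[dim](3, 3, 3)` builds a conv with 3 in-channels, 3 out-channels and kernel size 3, which is why the example inputs had to change shape as well.

```python
import torch
import torch.nn as nn

# dim -> Conv class, as in the test fixture
conv_module = {1: nn.Conv1d, 2: nn.Conv2d, 3: nn.Conv3d}

conv = conv_module[2](3, 3, 3).float()  # Conv2d(in=3, out=3, kernel_size=3)
x = torch.randn(1, 3, 10, 10)           # batch 1, 3 channels, 10x10 spatial
y = conv(x)
assert y.shape == (1, 3, 8, 8)           # 10 - 3 + 1 = 8 per spatial dim
```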
This pull request has been merged in 246d7bb.
Summary: Pull Request resolved: pytorch#39826 Expanding operator test coverage to traced modules Test Plan: Imported from OSS Differential Revision: D21991266 fbshipit-source-id: 73b1d94caa6ad41bb0d6cbde7ba0de343da3e7ff
Stack from ghstack:

- #39925 [quant][graphmode] Support quantizing `repeat`
- #39826 [quant][graphmode] Quantizing traced modules (this PR)

Summary:
Expanding operator test coverage to traced modules
Test Plan:
Reviewers:
Subscribers:
Tasks:
Tags:
Differential Revision: D21991266