
[quant][graphmode] Different rule for add/add_/mul/mul_ #38667

Closed

jerryzh168 wants to merge 7 commits into gh/jerryzh168/321/base from gh/jerryzh168/321/head

Conversation

@jerryzh168 (Contributor) commented May 18, 2020

Stack from ghstack:

Summary:

Test Plan:

Reviewers:

Subscribers:

Tasks:

Tags:

Differential Revision: D21633555


[ghstack-poisoned]
dr-ci Bot commented May 18, 2020

💊 CI failures summary and remediations

As of commit 5d11776 (more details on the Dr. CI page):


  • 2/2 failures possibly* introduced in this PR
    • 2/2 non-CircleCI failure(s)

Extra GitHub checks: 1 failed


ci.pytorch.org: 1 failed


This comment was automatically generated by Dr. CI.

Comment thread: test/quantization/test_quantize_script.py

    " with instruction set support avx2 or newer.")
    def test_quantized_add(self):
    class Add(torch.nn.Module):
    class QuantizedAdd(torch.nn.Module):
Contributor commented:
For brevity, you could reuse all the add tests for mul if you make the operator itself a parameter when you define the model.

jerryzh168 (Contributor, Author) replied:
That would be ideal if we can do it. I'm not sure it will work with TorchScript if we make the operator an argument, but we may do a refactor later.
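The reviewer's suggestion can be sketched in plain Python. This is a hypothetical, torch-free stand-in: `make_binop_model` and `check_binop` are illustrative names, and the real tests would define `torch.nn.Module` subclasses and run them through TorchScript.

```python
import operator

# Hypothetical sketch of the reviewer's suggestion: write the test body once
# and pass the binary operator in as a parameter, so the add tests can be
# reused verbatim for mul. (Plain-Python stand-in; the real tests define
# torch.nn.Module subclasses and script them.)

def make_binop_model(op):
    """Return a callable 'model' that applies `op` to its two inputs."""
    def model(x, y):
        return op(x, y)
    return model

def check_binop(op, x, y, expected):
    """Shared test body: build the model for `op` and verify its output."""
    model = make_binop_model(op)
    assert model(x, y) == expected

# One parameterized loop covers both add and mul:
for op, expected in [(operator.add, 5), (operator.mul, 6)]:
    check_binop(op, 2, 3, expected)
```

Note that parameterizing at test-generation time, before scripting, would sidestep the author's concern about whether TorchScript accepts an operator passed as an argument.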

Summary:

Test Plan:

Reviewers:

Subscribers:

Tasks:

Tags:

Differential Revision: [D21633555](https://our.internmc.facebook.com/intern/diff/D21633555)

[ghstack-poisoned]
Comment thread: torch/csrc/jit/passes/quantization/helper.cpp

    static constexpr const char* magic_method_names[] = {
        "__ixor__",
        "__str__",
        "__len__",
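As an illustration of how such a name table might be consulted, here is a plain-Python sketch; `is_magic_method_call` is a hypothetical helper, and only the three names visible in the diff hunk are listed.

```python
# Excerpt of the names visible in the diff hunk above; the real C++ array
# (magic_method_names in helper.cpp) contains many more entries.
MAGIC_METHOD_NAMES = ("__ixor__", "__str__", "__len__")

def is_magic_method_call(name: str) -> bool:
    """Hypothetical helper: is `name` one of the listed magic methods?"""
    return name in MAGIC_METHOD_NAMES

assert is_magic_method_call("__len__")
assert not is_magic_method_call("forward")
```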
jerryzh168 (Contributor, Author) commented:

@ZolotukhinM @suo can I commit this?

@facebook-github-bot commented:

This pull request has been merged in a8d8fc5.

@facebook-github-bot facebook-github-bot deleted the gh/jerryzh168/321/head branch May 24, 2020 14:15
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
Summary: Pull Request resolved: pytorch#38667

Test Plan: Imported from OSS

Differential Revision: D21633555

fbshipit-source-id: 03b0298e83bf4dbda41b048c0edc7bb92cd4e1df

Labels

Merged · oncall: jit (Add this issue/PR to JIT oncall triage queue)


4 participants