[quant][graphmode] Support propagate dequantize for nodes with multiple outputs #39551

Closed
jerryzh168 wants to merge 8 commits into gh/jerryzh168/335/base from gh/jerryzh168/335/head

Conversation

@jerryzh168
Contributor

@jerryzh168 jerryzh168 commented Jun 5, 2020

Stack from ghstack:

Summary:
Support propagating dequantize ops through nodes with multiple outputs, e.g. prim::ListUnpack.


Differential Revision: D21942108
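Since the description is terse, here is a minimal plain-Python sketch (names and values are illustrative, not the PR's code) of why dequantize can be propagated through a multi-output op such as prim::ListUnpack: dequantization is elementwise, so unpacking first and then dequantizing each output produces the same values as dequantizing before the unpack.

```python
# Illustrative sketch only: simulate affine dequantization and a
# two-output "unpack" to show the rewrite is value-preserving.
SCALE, ZERO_POINT = 0.5, 1

def dequantize(qvals):
    # affine dequantization: (q - zero_point) * scale
    return [(q - ZERO_POINT) * SCALE for q in qvals]

def list_unpack(vals):
    # stand-in for prim::ListUnpack on a two-element packed value
    mid = len(vals) // 2
    return vals[:mid], vals[mid:]

q = [0, 3, 7, 12]

# Before the pass: dequantize, then unpack.
a, b = list_unpack(dequantize(q))

# After the pass: unpack first, then dequantize each output.
qa, qb = list_unpack(q)
a2, b2 = dequantize(qa), dequantize(qb)

assert (a, b) == (a2, b2)
```

Because the two orderings agree, the pass can safely push the dequantize below the multi-output node, which lets the ops in between be matched against quantized patterns.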

jerryzh168 added a commit that referenced this pull request Jun 5, 2020
[quant][graphmode] Support propagate dequantize for nodes with multiple outputs

ghstack-source-id: be66eb3
Pull Request resolved: #39551
@dr-ci
dr-ci Bot commented Jun 5, 2020

💊 CI failures summary and remediations

As of commit 1952a4e (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚



return x + y
x = torch.randn([1, 3, 10, 10], dtype=torch.float)
x = torch.quantize_per_tensor(x, 0.5, 1, torch.quint8)
input = [x, x]
Contributor

Should we also add a test for tracing here?

Contributor Author

I think we'll need someone to expand all of the tests to cover tracing

Contributor Author

This one probably doesn't make much of a difference, but I can add it

return x + y
x = torch.randn([1, 3, 10, 10], dtype=torch.float)
x = torch.quantize_per_tensor(x, 0.5, 1, torch.quint8)
input = [x, x]
Contributor

Do both quantize_per_tensor and dequantize now support list of tensors?

Contributor Author

Yeah, that's correct: https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/native/native_functions.yaml#L3580, but I don't think we really need the list version of quantize_per_tensor right now
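For reference, a quick check of the TensorList overload mentioned above (a sketch; it assumes a PyTorch build where torch.dequantize accepts a list of quantized tensors and returns the dequantized tensors):

```python
import torch

x = torch.randn(1, 3, 10, 10, dtype=torch.float)
qx = torch.quantize_per_tensor(x, scale=0.5, zero_point=1, dtype=torch.quint8)

# TensorList overload of dequantize: takes a list of quantized tensors
# and returns each of them dequantized.
outs = torch.dequantize([qx, qx])

assert len(outs) == 2
assert torch.equal(outs[0], qx.dequantize())
```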

.check("dequantize") \
    .run(m.graph)
- res = get_forward(m._c)(x)
+ res = m(input)
Contributor

Wouldn't the ListUnpack op still be needed before the maxpool and avgpool, even after the swap-dequantize pass?

Contributor Author

Yeah, it's still there; I forgot to add a check for it
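As a side note, the presence of the op is easy to assert on: scripting a function that destructures the output of torch.chunk lowers to prim::ListUnpack in the TorchScript graph, which FileCheck can verify (a sketch using torch.testing.FileCheck; the toy function is an assumption, not the PR's actual test):

```python
import torch
from torch.testing import FileCheck

@torch.jit.script
def f(x: torch.Tensor):
    # chunk returns a Tensor[]; destructuring it into two names
    # lowers to prim::ListUnpack in the TorchScript graph
    a, b = torch.chunk(x, 2, dim=0)
    return a, b

# FileCheck raises if the pattern is missing from the graph
FileCheck().check("prim::ListUnpack").run(f.graph)
```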

}
}
} else {
// For this category of ops, we need to
Contributor

What is the mechanism for warning or raising errors during quantization? In this case, it would be good to error out with a specific message.

if (n->outputs().size() > 1) {
  // Factoring out dequantize for if blocks with multiple outputs
  // is not supported right now
  continue;
}

Contributor Author

This should probably be a warning: we might not actually need this functionality when there are multiple outputs, but if we do need it, the result would be incorrect, so the user should be notified.

Contributor Author

In general, when it's a case we clearly don't support, we throw an error

}
}
} else {
// For this category of ops, we need to
Contributor

Also, when you say "this category" of ops, do you mean operations that have multiple inputs or outputs?

Contributor Author

No, this is for ops that we quantize by propagating dequantize ops, e.g. flatten

Contributor

> no, this is for ops that we quantize by propagating dequantize ops, e.g. flatten

maybe we can include this in the comment?
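To make the category concrete: for ops like flatten, the pass rewrites dequantize -> flatten into flatten -> dequantize, which is value-preserving because flatten only rearranges elements. A quick sketch (it assumes flatten/reshape is supported on per-tensor quantized tensors in PyTorch):

```python
import torch

x = torch.randn(2, 3, 4)
qx = torch.quantize_per_tensor(x, scale=0.1, zero_point=0, dtype=torch.quint8)

# Before the pass: dequantize feeds flatten.
before = torch.flatten(qx.dequantize(), start_dim=1)

# After the pass: flatten runs on the quantized tensor and the
# dequantize is propagated below it.
after = torch.flatten(qx, start_dim=1).dequantize()

assert torch.equal(before, after)
```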

@facebook-github-bot
Contributor

This pull request has been merged in 2a06a69.

@facebook-github-bot facebook-github-bot deleted the gh/jerryzh168/335/head branch June 13, 2020 14:16
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
[quant][graphmode] Support propagate dequantize for nodes with multiple outputs (pytorch#39551)

Summary:
Pull Request resolved: pytorch#39551

e.g. prim::ListUnpack

Test Plan: Imported from OSS

Differential Revision: D21942108

fbshipit-source-id: 6fd7c972ca70692ec52c296b6a1e858324e66c12

Labels

Merged oncall: jit Add this issue/PR to JIT oncall triage queue
