
Improve error message for Pad operator #39651

Closed

yaeldekel wants to merge 9 commits into pytorch:master from yaeldekel:pad

Conversation

@yaeldekel

In issue #36997 the user encountered a non-meaningful error message when trying to export the model to ONNX. The Pad operator in opset 9 requires the list of paddings to be constant. This PR tries to improve the error message given to the user when this is not the case.
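For context: the opset-9 ONNX Pad operator stores its paddings as a static attribute, so the exporter must fold the PyTorch pad list into a fixed layout at export time; a list computed at runtime cannot be folded, which is what triggered the unhelpful message. The layout conversion itself can be sketched as follows (illustrative only; the actual logic lives in torch/onnx/symbolic_opset9.py and may differ in detail):

```python
def torch_pad_to_onnx_pads(pad, rank):
    """Convert a PyTorch-style pad list (pairs ordered from the LAST
    dimension inward, e.g. (left, right, top, bottom)) into the ONNX Pad
    'pads' layout (all begins first, then all ends, ordered from the
    FIRST dimension). Only possible when `pad` is a compile-time constant.
    """
    assert len(pad) % 2 == 0 and len(pad) // 2 <= rank
    begins = [0] * rank
    ends = [0] * rank
    for i in range(len(pad) // 2):
        axis = rank - 1 - i            # pairs start at the last dimension
        begins[axis] = pad[2 * i]      # "before" padding for this axis
        ends[axis] = pad[2 * i + 1]    # "after" padding for this axis
    return begins + ends

# Pad the last dim of a rank-4 (NCHW) tensor: left=1, right=2
assert torch_pad_to_onnx_pads([1, 2], 4) == [0, 0, 0, 1, 0, 0, 0, 2]
```

If the pad values are graph values rather than Python ints, this folding cannot happen, and the exporter has to fail with a message explaining exactly that.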

@dr-ci

dr-ci Bot commented Jun 8, 2020

💊 CI failures summary and remediations

As of commit 3046a00 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚



Comment thread: torch/onnx/symbolic_opset9.py (Outdated)

Contributor

Thanks, what's the error message in this case? Does it clarify the non-Constant issue?

@ngimel ngimel requested a review from houseroad June 10, 2020 21:46
@ngimel ngimel added the triaged label (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) Jun 10, 2020
Contributor

@facebook-github-bot left a comment

@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

Contributor

@neginraoof left a comment

Thanks! Can you add a test to cover this use-case?

Comment thread: torch/onnx/symbolic_opset9.py (Outdated)

Contributor

My understanding is that the padding arg in the reflection_pad and replication_pad ops is always an array of ints.
Could padding be a single int?
Also, can you use the _maybe_get_const() function from symbolic_helper to parse 'is'?
If the output is still a torch._C.Value, then throw the exception?
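The behavior suggested above can be sketched with stand-in types (illustrative only; not the real torch._C.Value or symbolic_helper API):

```python
class Value:
    """Stand-in for a graph value: the kind of its producing node plus,
    for constants, an already-parsed Python payload."""
    def __init__(self, kind, payload=None):
        self.kind = kind
        self.payload = payload

def maybe_get_const(value, desc):
    """Sketch of the _maybe_get_const behavior described above: return the
    parsed value for Constant nodes, or the Value itself when non-constant,
    so the caller can decide whether to raise a clear error."""
    if value.kind in ("onnx::Constant", "prim::Constant"):
        return value.payload  # parsed according to desc ('is' = list of ints)
    return value              # still a graph value: not constant

padding = maybe_get_const(Value("onnx::Constant", [1, 1, 2, 2]), "is")
assert padding == [1, 1, 2, 2]

dynamic = maybe_get_const(Value("aten::mul"), "is")  # pads computed at runtime
assert isinstance(dynamic, Value)  # caller should raise a Pad-specific error here
```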

Author

You are correct. The padding arg has to be an array of ints, which can be represented either as an onnx::Constant node containing a tensor value or as a prim::ListConstruct node.



Contributor

@neginraoof commented Jun 22, 2020

Sorry, I meant sym_help._get_const here.
Maybe something like:

    try:
        padding = sym_help._get_const(padding, 'is', 'constant_pad_nd')
    except Exception:
        _unimplemented('constant_pad_nd', "<msg>")

Comment thread: torch/onnx/symbolic_opset9.py (Outdated)

Contributor

Can you simplify the code by using _unpack_list (https://github.com/pytorch/pytorch/blob/master/torch/onnx/symbolic_helper.py#L109) to get the list values from prim::ListConstruct?
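The _unpack_list idea referenced above can be sketched with a stand-in node type (illustrative only; the real helper works on torch._C.Node):

```python
class Node:
    """Stand-in for a graph node: a kind string plus its input values."""
    def __init__(self, kind, inputs=()):
        self._kind = kind
        self._inputs = list(inputs)
    def kind(self):
        return self._kind
    def inputs(self):
        return self._inputs

def unpack_list(list_node):
    """Sketch of the sym_help._unpack_list idea: pull the element values
    out of a prim::ListConstruct node as a plain Python list."""
    assert list_node.kind() == "prim::ListConstruct", "not a ListConstruct"
    return list(list_node.inputs())

pads = Node("prim::ListConstruct", inputs=[0, 2, 0, 2])
assert unpack_list(pads) == [0, 2, 0, 2]
```

Note that unpacking alone does not guarantee the elements are constants; each element still has to be checked, which is the point discussed below in the thread.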

Contributor

@neginraoof left a comment

Thanks! Are there any related failures on CI?

Comment thread: torch/onnx/symbolic_opset9.py (Outdated)

Contributor

Sorry, I should've asked for sym_help._get_const here.
This one throws an error if not all list elements are const.
Maybe something like:

    try:
        padding = sym_help._get_const(padding, 'is', 'constant_pad_nd')
    except Exception:
        _unimplemented('constant_pad_nd', "<msg>")

Author

@yaeldekel commented Jun 23, 2020

I think sym_help._get_const won't work in this case, because this is the condition that it checks (it throws an exception when the condition is true):

    _is_value(value) and value.node().kind() not in ('onnx::Constant', 'prim::Constant')

and in the padding case, the node can also be a prim::ListConstruct.

But I did use your suggestion below for parsing the value in constant_pad_nd.


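The quoted condition can be exercised with stand-in types (hypothetical names; the real check lives in symbolic_helper._get_const):

```python
CONST_KINDS = ("onnx::Constant", "prim::Constant")

class Value:
    """Stand-in for torch._C.Value, reduced to its producing node's kind."""
    def __init__(self, kind):
        self._kind = kind
    def node_kind(self):
        return self._kind

def get_const(value, desc, op_name):
    """Sketch of the condition quoted above: any value whose producing node
    is not a Constant raises -- so a prim::ListConstruct input always fails,
    which is exactly the author's point."""
    if value.node_kind() not in CONST_KINDS:
        raise RuntimeError(
            f"ONNX export of {op_name}: expected a constant {desc!r}, "
            f"got a value produced by {value.node_kind()}")
    return value

get_const(Value("prim::Constant"), "is", "constant_pad_nd")  # accepted
failed = False
try:
    get_const(Value("prim::ListConstruct"), "is", "constant_pad_nd")
except RuntimeError:
    failed = True  # the ListConstruct case the author describes
assert failed
```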

Contributor

How about after you unpack the list? Maybe like:

    try:
        padding = [sym_help._get_const(v, 'is', 'constant_pad_nd') for v in input_list]
    except Exception:
        _unimplemented('constant_pad_nd', "<msg>")

It checks for the prim::Constant type as well, and using _unimplemented is more consistent.
I'm trying to avoid custom arg parsing or error handling for symbolics.
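Putting the two suggestions together, per-element checking after unpacking could look like this sketch (stand-in types and a hypothetical error message; the merged PR's exact wording may differ):

```python
CONST_KINDS = ("onnx::Constant", "prim::Constant")

class Value:
    """Stand-in graph value: node kind plus a constant payload when present."""
    def __init__(self, kind, payload=None):
        self.kind = kind
        self.payload = payload

def unimplemented(op, msg):
    """Sketch of symbolic_helper._unimplemented: one consistent error path."""
    raise RuntimeError(f"ONNX export of operator {op}, {msg}, is not supported")

def element_const(value):
    """Per-element version of the _get_const idea: each unpacked list
    element must itself come from a Constant node."""
    if value.kind not in CONST_KINDS:
        raise ValueError("non-constant list element")
    return value.payload

def parse_padding(unpacked, op="constant_pad_nd"):
    """Check every element of an unpacked prim::ListConstruct, falling back
    to the consistent 'unimplemented' error for dynamic paddings."""
    try:
        return [element_const(v) for v in unpacked]
    except ValueError:
        unimplemented(op, "with non-constant padding")

assert parse_padding([Value("prim::Constant", 1), Value("prim::Constant", 2)]) == [1, 2]
```

Routing the failure through a single `_unimplemented`-style helper keeps the error wording uniform across symbolics, which is the consistency argument made above.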

Comment thread: torch/onnx/symbolic_opset9.py (Outdated)

Contributor

This looks okay for now. If we see this type of error handling (per opset) repeating more often, we should handle it in a separate function to make it more generic.

@yaeldekel
Author

cc @houseroad for review. Thanks!

Member

@houseroad left a comment

LG, thanks

Contributor

@facebook-github-bot left a comment

@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@houseroad merged this pull request in 6e4f501.

csarofeen pushed a commit to csarofeen/pytorch that referenced this pull request Jul 7, 2020
Summary:
In issue pytorch#36997 the user encountered a non-meaningful error message when trying to export the model to ONNX. The Pad operator in opset 9 requires the list of paddings to be constant. This PR tries to improve the error message given to the user when this is not the case.

Pull Request resolved: pytorch#39651

Reviewed By: hl475

Differential Revision: D21992262

Pulled By: houseroad

fbshipit-source-id: b817111c2a40deba85e4c6cdb874c1713312dba1
csarofeen added a commit to csarofeen/pytorch that referenced this pull request Aug 16, 2020
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026

Labels

Merged · open source · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)


9 participants