
Fuse Pad op before Conv2D op into Conv op as an attribute#334

Merged
pengwa merged 2 commits into onnx:master from lucienwang1009:fuse_pad
Mar 2, 2019
Conversation

Collaborator

@lucienwang1009 lucienwang1009 commented Feb 28, 2019

@guschmue For ResNet-50, three Pad ops are fused into Conv.

See also #103 #201
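For readers unfamiliar with the optimization, here is a minimal numerical sketch (plain NumPy, not tf2onnx code) of why a standalone Pad followed by a Conv is equivalent to a single Conv that carries the padding in its `pads` attribute. The `conv2d` helper is an illustrative stand-in for a real convolution:

```python
import numpy as np

def conv2d(x, w, pads=(0, 0, 0, 0)):
    """Minimal 2D cross-correlation with an ONNX-style `pads` attribute
    (top, left, bottom, right). Illustration only, not tf2onnx code."""
    t, l, b, r = pads
    x = np.pad(x, ((t, b), (l, r)))
    kh, kw = w.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

x = np.arange(16.0).reshape(4, 4)
w = np.ones((3, 3))

# A standalone Pad followed by an unpadded Conv ...
padded = np.pad(x, ((1, 1), (1, 1)))
separate = conv2d(padded, w)

# ... produces the same result as one Conv carrying the padding as an attribute.
fused = conv2d(x, w, pads=(1, 1, 1, 1))
assert np.array_equal(separate, fused)
```

The fusion removes one graph node per Pad/Conv pair without changing the computed values.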

@pengwa pengwa requested a review from guschmue February 28, 2019 14:56
```python
paddings = pad.inputs[1]

if not paddings.is_const():
    return ops
```
Collaborator

Should we use `continue` instead of `return`?

Collaborator Author

yes, my fault.

```python
if mode:
    mode = mode.s.decode("utf-8").lower()
if mode not in [None, "constant"] or len(pad.input) >= 3:
    return ops
```
Collaborator

same here
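The two review comments point at the same pitfall: inside a loop over pattern matches, `return ops` abandons every remaining match, while `continue` skips only the current one. A runnable contrast (generic sketch, not the rewriter's actual code):

```python
matches = ["fusable", "not_const", "fusable"]

def fuse_with_return(matches):
    fused = []
    for m in matches:
        if m == "not_const":
            return fused          # bug: abandons the remaining matches
        fused.append(m)
    return fused

def fuse_with_continue(matches):
    fused = []
    for m in matches:
        if m == "not_const":
            continue              # skip only this unfusable match
        fused.append(m)
    return fused

fuse_with_return(matches)    # ['fusable'] -- the second fusable match is lost
fuse_with_continue(matches)  # ['fusable', 'fusable']
```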

```python
for match in match_results:
    conv = match.get_op("conv")
    pad = match.get_op("pad")
    paddings = pad.inputs[1]
```
Collaborator

Any bad impact if the Pad node's output is used more than once?

Collaborator Author

Here, I keep the Pad op intact and change the input of the following Conv2D via replace_input, so other nodes that use this Pad op are unaffected.
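A toy sketch of why rewiring only the Conv's input edge is safe when the Pad output has other consumers. `Node` and `Graph` here are illustrative stand-ins, not tf2onnx's actual classes:

```python
class Node:
    """Minimal stand-in: a node with a name and a list of input tensor names."""
    def __init__(self, name, inputs=()):
        self.name = name
        self.input = list(inputs)

class Graph:
    """Minimal stand-in exposing only the replace_input behavior discussed."""
    def replace_input(self, node, old_name, new_name):
        node.input = [new_name if i == old_name else i for i in node.input]

data = Node("data")
pad = Node("pad", inputs=["data"])
conv = Node("conv", inputs=["pad:0", "weights"])
other = Node("other", inputs=["pad:0"])  # a second consumer of the Pad output

g = Graph()
g.replace_input(conv, "pad:0", "data")  # rewire only the Conv's edge

conv.input   # ['data', 'weights']
other.input  # ['pad:0'] -- untouched, Pad still feeds this node
```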

```python
paddings_val = paddings_val[1:3]
paddings_val = paddings_val.transpose().flatten()
g.replace_input(conv, conv.input[0], pad.input[0])
# convert Conv2D
```
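The slice-transpose-flatten sequence converts TF's per-dimension `(begin, end)` paddings into ONNX's `pads` layout. A numeric sketch (assuming an NHWC input, so rows 1 and 2 are the spatial H and W dimensions):

```python
import numpy as np

# TF Pad paddings for an NHWC tensor: one (begin, end) pair per dimension.
paddings_val = np.array([[0, 0],   # batch
                         [1, 2],   # height
                         [3, 4],   # width
                         [0, 0]])  # channels

spatial = paddings_val[1:3]              # keep only the H and W rows
onnx_pads = spatial.transpose().flatten()
# ONNX `pads` layout: all begins first, then all ends:
# [x1_begin, x2_begin, ..., x1_end, x2_end, ...]
onnx_pads.tolist()  # [1, 3, 2, 4]
```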
Collaborator

If you want to do the conversion here, I think you should also consider an opset check.

Collaborator

Shall we consider doing the fusion in a later rewriter? Then we won't need the trick of calling the mapping function at this stage.

Collaborator

Or it could even be part of the optimizer.

Collaborator Author

I have thought of this, but then we would need to handle the case in which the converted Pad and Conv nodes are separated by Cast and Transpose nodes inserted by the converter.

Collaborator Author

> if you want to do conversion here, I think you should also consider opset check.

Talked with @pengwa offline. We decided not to check the opset here, as the Conv op has not changed since opset 1.
But we found that many pre-rewriters create ONNX nodes without doing this check; things will break if nodes of the wrong version are used in those rewriters.

Contributor

Not checking sounds correct. I guess if something changes in a future opset, we'll depend on unit tests to remind us.
