Fuse Pad op before Conv2D op into Conv op as an attribute #334
pengwa merged 2 commits into onnx:master
Conversation
tf2onnx/tfonnx.py (outdated)

    paddings = pad.inputs[1]

    if not paddings.is_const():
        return ops
Should this use `continue` instead of `return`?
yes, my fault.
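For context, the distinction matters because the handler iterates over all matched Pad→Conv pairs: an early `return ops` abandons every remaining match, while `continue` skips only the one that can't be fused. A minimal sketch of the loop shape (the `is_fusable`/`fuse` helpers are hypothetical stand-ins for the real checks and fusion logic, not tf2onnx APIs):

```python
def is_fusable(match):
    # Stand-in for the real checks (constant paddings,
    # mode == "constant", no constant_values input, etc.).
    return match["fusable"]

def fuse(match, fused):
    # Stand-in for the actual Pad-into-Conv fusion.
    fused.append(match["name"])

def rewrite_pad_conv(ops, match_results, fused):
    for match in match_results:
        if not is_fusable(match):
            # `continue` skips only this match; `return ops` here
            # would also abandon every remaining fusable match.
            continue
        fuse(match, fused)
    return ops
```

With `return` in place of `continue`, a single non-fusable match at the front of `match_results` would prevent all later fusions.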
tf2onnx/tfonnx.py (outdated)

    if mode:
        mode = mode.s.decode("utf-8").lower()
    if mode not in [None, "constant"] or len(pad.input) >= 3:
        return ops

    for match in match_results:
        conv = match.get_op("conv")
        pad = match.get_op("pad")
        paddings = pad.inputs[1]
Any bad impact if the padding node is used more than once?
Here, I keep the Pad op intact and change only the input of the following Conv2D via `replace_input`. It won't affect other nodes that use this Pad op.
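Conceptually, rewiring just the Conv's input edge leaves the Pad node and its other consumers untouched. A toy illustration of that edge rewrite, using plain dicts rather than the real tf2onnx graph API:

```python
# Toy graph: each node lists the names of its input nodes.
nodes = {
    "pad":   {"input": ["x", "paddings"]},
    "conv":  {"input": ["pad", "weights"]},
    "other": {"input": ["pad"]},  # another consumer of the Pad output
}

def replace_input(node, old, new):
    # Swap one input edge on a single node; no other node is touched.
    node["input"] = [new if i == old else i for i in node["input"]]

# Bypass the Pad for the Conv only: Conv now reads Pad's own input "x".
replace_input(nodes["conv"], "pad", "x")
print(nodes["conv"]["input"])   # ['x', 'weights']
print(nodes["other"]["input"])  # ['pad'] -- unaffected
```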
    paddings_val = paddings_val[1:3]
    paddings_val = paddings_val.transpose().flatten()
    g.replace_input(conv, conv.input[0], pad.input[0])
    # convert Conv2D
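The slice/transpose/flatten above converts TF's per-dimension NHWC paddings layout into the ordering ONNX's `pads` attribute expects (all begins first, then all ends). A quick NumPy check of the transform, with illustrative values:

```python
import numpy as np

# TF Pad paddings for an NHWC tensor: one [begin, end] row per dimension.
paddings_val = np.array([[0, 0],   # batch: no padding
                         [1, 2],   # height: top=1, bottom=2
                         [3, 4],   # width: left=3, right=4
                         [0, 0]])  # channels: no padding

# Keep only the spatial dims (H, W), then reorder to ONNX's
# pads layout [x1_begin, x2_begin, x1_end, x2_end].
pads = paddings_val[1:3].transpose().flatten()
print(pads.tolist())  # [1, 3, 2, 4] -> top, left, bottom, right
```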
If you want to do the conversion here, I think you should also consider an opset check.
Shall we consider doing the fusion in a later rewriter? Then we wouldn't need the trick of calling the mapping function at this stage.
Or it could even be part of the optimizer.
I have thought of this, but then we would need to handle the case in which the converted Pad and Conv nodes are separated by Cast and Transpose nodes inserted by the converter.
> If you want to do the conversion here, I think you should also consider an opset check.

Talked with @pengwa offline. We decided not to check the OPSET here, since the Conv op has not changed since OPSET 1.
However, we found that many pre-rewriters create ONNX nodes without doing this check. Things will break if we use nodes of the wrong version in these rewriters.
Not checking sounds correct. I guess if something changes in a future opset, we'll depend on the unit tests to remind us.
@guschmue For resnet-50, three Pad ops are fused into Conv.
See also #103 and #201.