OK, I tried to make it short.
To reproduce the issue, do as follows:
- Download the public pre-trained ResNet-50 v2 model from the TensorFlow models git repo; select the ResNet-50 v2, fp32, SavedModel, NHWC variant.
- Use `tensorflow.python.tools.freeze_graph` to export a frozen graph from the SavedModel (see instructions here). Here is the command I used to create the frozen graph from the SavedModel:

```shell
bazel-bin/tensorflow/python/tools/freeze_graph \
  --input_saved_model_dir="/root/models/resnet50/tensorflow/resnet_v2_fp32_savedmodel_NHWC/1538687283" \
  --output_graph="/root/models/resnet50/tensorflow/frozen_graph/resnet50_v2_frozen_graph.pb" \
  --output_node_names="softmax_tensor"
```
For your convenience, here is my pregenerated frozen graph.
- Now use `tf2onnx.convert` to convert the frozen graph from step (2) to ONNX format. Here is the command I used to create the ONNX model from the frozen graph:

```shell
INPUT_FROZEN_GRAPH_PB="/root/models/resnet50/tensorflow/frozen_graph/resnet50_v2_frozen_graph.pb"
OUTPUT_ONNX_FILE="/root/models/resnet50/onnx/resnet50_v2.onnx"
INPUTS="input_tensor:0"
OUTPUTS="softmax_tensor:0"
# Adding --inputs-as-nchw $INPUTS makes no difference regarding this issue.
python3 -m tf2onnx.convert \
  --input $INPUT_FROZEN_GRAPH_PB \
  --inputs $INPUTS \
  --outputs $OUTPUTS \
  --output $OUTPUT_ONNX_FILE \
  --opset 7 \
  --verbose
```
For your convenience, here is my pregenerated ONNX model.
- Use `onnx2trt` from onnx-tensorrt to load and parse the ONNX model from step (3). You will then see this error message:

```
Parsing ONNX file...
----------------------------------------------------------------
Input filename:   /root/models/resnet50/onnx/resnet50_v2.onnx
ONNX IR version:  0.0.3
Opset version:    7
Producer name:    tf2onnx
Producer version: 0.4.0
Domain:
Model version:    0
Doc string:
----------------------------------------------------------------
While parsing node number 5 [Pad -> "resnet_model/Pad:0"]:
ERROR: /home/erisuser/p4sw/sw/gpgpu/MachineLearning/DIT/release/5.0/parsers/onnxOpenSource/builtin_op_importers.cpp:1135 In function importPad:
[8] Assertion failed: onnx_padding[0] == 0 && onnx_padding[1] == 0 && onnx_padding[4] == 0 && onnx_padding[5] == 0
ERROR: failed to parse onnx file
```
**Thoughts**
I visualized the graph topologies of both the frozen graph and its ONNX counterpart. Apparently, the padding information (i.e. padding of 3 pixels on the left, right, top and bottom of the input image) gets corrupted in step (3).
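My guess (an assumption on my part, not verified against the tf2onnx source) is that this is an NHWC vs NCHW ordering problem: the opset-7 `pads` attribute lists all begin-pads followed by all end-pads, in the tensor's dimension order, and the TensorRT assertion above requires the batch/channel slots of an NCHW tensor to be zero. A small sketch of the two layouts:

```python
# ONNX opset-7 `pads` layout: [d0_begin, d1_begin, ..., d0_end, d1_end, ...].
# TensorRT's importPad asserts pads[0], pads[1], pads[4], pads[5] are zero,
# i.e. no padding on the batch and channel dims of an NCHW tensor.

# 3-pixel spatial padding expressed for an NCHW tensor (what TensorRT accepts):
nchw_pads = [0, 0, 3, 3, 0, 0, 3, 3]

# The same padding if the original NHWC dimension order is kept:
nhwc_pads = [0, 3, 3, 0, 0, 3, 3, 0]

# Under an NCHW reading, entries 1 and 5 (the "channel" slots) are now 3,
# which matches exactly the assertion that fails in importPad.
print([nhwc_pads[i] for i in (0, 1, 4, 5)])  # → [0, 3, 0, 3]
```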
Here is what the padding node looks like in the original frozen graph:

And here is what it looks like after the conversion in step (3):
