
Model Optimizer errors when trying to optimize official Tensorflow implementation of Resnet model #11

@stefan-andritoiu

Description


TF Model repo used: https://github.com/tensorflow/models/tree/master/official/resnet
Observation: The implementation uses the Estimator framework; a SavedModel of the inference graph is exported.

Trying to optimize a saved model:
Model Optimizer command used:
python3 /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/mo_tf.py --saved_model_dir <model_dir>

Error:
Model Optimizer version: 1.2.185.5335e231
[ ERROR ] Cannot infer shapes or values for node "images".
[ ERROR ] 'bytes' object has no attribute 'shape'
[ ERROR ]
[ ERROR ] It can happen due to bug in custom shape infer function <function tf_native_tf_node_infer at 0x7fddbbbba488>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ] Stopped shape/value propagation at "images" node.
For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #38.
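As the error output suggests, one possible workaround is to cut the graph at the real network input with the Model Optimizer's --input and --input_shape options, so shape inference never visits the failing "images" node. A hypothetical invocation; the input node name and shape below are illustrative placeholders, not taken from this model:

```shell
# Hypothetical graph-cut workaround: "input_tensor" and the shape are
# placeholders -- check the actual input node name in the saved model.
python3 /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/mo_tf.py \
    --saved_model_dir <model_dir> \
    --input input_tensor \
    --input_shape [1,224,224,3] \
    --log_level=DEBUG
```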

Cause (in TF Python model implementation):
...
#Generate a summary node for the images
tf.summary.image('images', features, max_outputs=6)
...
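Rather than deleting the line outright, the summary op could be gated on training mode so it never reaches the exported inference graph. A minimal sketch, assuming a standard Estimator model_fn (the signature is the usual Estimator contract; surrounding code is elided):

```python
def model_fn(features, labels, mode, params):
    # Emit the image summary only while training; summaries are
    # training-time side effects and should not end up in the
    # serving graph exported for inference.
    if mode == tf.estimator.ModeKeys.TRAIN:
        tf.summary.image('images', features, max_outputs=6)
    ...
```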

FIX: After removing the tf.summary.image statement above, running the same command triggered the following asserts:

1st one: /opt/intel/computer_vision_sdk_2018.3.343/deployment_tools/model_optimizer/mo/ops/op.py:173

assert all(old_shape is None for old_shape in old_data_shape) or all([np.array_equal(old_data_shape[id], data_node.shape) for id, data_node in enumerate(data_nodes)])
Commenting it out led to the 2nd assert:

2nd one: /opt/intel/computer_vision_sdk_2018.3.343/deployment_tools/model_optimizer/mo/middle/passes/fusing/fuse_linear_seq.py:81

assert (np.array_equal(fnodes[0].in_node(get_tensor_id(fnodes[0])).shape, fnodes[-1].out_node().shape))
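For context, both asserts enforce the same invariant: the shape entering a fusable linear sequence must equal the shape leaving it. A minimal numpy illustration of the check (the shapes are made up):

```python
import numpy as np

# The fusing pass requires that the first node's input shape equals the
# last node's output shape; otherwise the sequence cannot be fused and
# the assert fires.
first_in_shape = np.array([1, 224, 224, 3])   # shape entering the sequence
last_out_shape = np.array([1, 224, 224, 3])   # shape leaving the sequence

assert np.array_equal(first_in_shape, last_out_shape)
print("shapes match; sequence is fusable")
```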

After commenting out that assert as well, the process finished with
[SUCCESS] Generated IR model
The generated .bin file is NOT empty (unlike with some earlier errors), but the Inference Engine fails with a segmentation fault while loading the network.
