System information (version)
- OpenCV => 4.1
- Operating System / Platform => Arch-Linux 64 Bit, Kernel 5.1.15-arch1-1-ARCH
- Compiler => cmake + ninja
Detailed description
I am trying to use the `tf_text_graph_ssd.py` script to generate a `.pbtxt` text graph file from a TensorFlow `frozen_inference_graph.pb` file.
At first I got the same exception that was mentioned in this issue and applied the fix there. But then I got another exception:
```
Traceback (most recent call last):
  File "/scripts/tf_text_graph_ssd.py", line 377, in <module>
    createSSDGraph(args.input, args.config, args.output)
  File "/scripts/tf_text_graph_ssd.py", line 261, in createSSDGraph
    addConstNode('concat/axis_flatten', [-1], graph_def)
  File "/scripts/tf_text_graph_common.py", line 114, in addConstNode
    graph_def.node.extend([node])
TypeError: Not a cmessage
```
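For context on where this fails: `addConstNode` builds a `NodeDef` and appends it with `graph_def.node.extend([node])`. My reading (an assumption, not confirmed) is that "Not a cmessage" is protobuf's C++ backed implementation rejecting a message object that was created by a different protobuf implementation than the parent `graph_def`. The toy classes below are stand-ins I made up, not the real TensorFlow/protobuf types; they only sketch the two patterns involved: a type-checking `extend()` versus letting the parent container allocate the element via `add()`:

```python
# Toy sketch (assumption: NodeDef/GraphDef/RepeatedNodeField here are
# stand-ins, not the real protobuf types) of why extending a repeated
# field with a foreign object fails, and the allocate-in-parent pattern.

class NodeDef:
    """Stand-in for a protobuf NodeDef: just a name and an op."""
    def __init__(self, name="", op=""):
        self.name = name
        self.op = op

class RepeatedNodeField:
    """Stand-in for a repeated message field that type-checks on extend."""
    def __init__(self):
        self._nodes = []

    def extend(self, nodes):
        for n in nodes:
            if not isinstance(n, NodeDef):
                # Analogous to protobuf's "Not a cmessage" rejection.
                raise TypeError("Not a cmessage")
            self._nodes.append(n)

    def add(self):
        # Protobuf-style: the parent allocates the element itself,
        # so it is guaranteed to be the matching type/implementation.
        n = NodeDef()
        self._nodes.append(n)
        return n

    def __len__(self):
        return len(self._nodes)

class GraphDef:
    def __init__(self):
        self.node = RepeatedNodeField()

graph_def = GraphDef()

# Appending a foreign object fails, mirroring the reported error:
try:
    graph_def.node.extend([object()])
except TypeError as e:
    print(e)  # Not a cmessage

# Allocate-in-parent pattern, then fill in the fields:
node = graph_def.node.add()
node.name = "concat/axis_flatten"
node.op = "Const"
print(len(graph_def.node))  # 1
```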
Steps to reproduce
- (I cannot provide the dataset I used for training, sorry)
- Use the latest TensorFlow models + scripts from master.
- Use the latest `tf_text_graph` scripts from this repo.
- I trained the SSDLite MobileNet v2 model; here is my dnn.cfg
- I used this pre-trained model
- You can create the `frozen_inference_graph.pb` file yourself by running

```
python /scripts/models/research/object_detection/model_main.py \
    --pipeline_config_path="dnn.cfg" \
    --model_dir="models"

python /scripts/models/research/object_detection/export_inference_graph.py \
    --input_type=image_tensor \
    --pipeline_config_path="dnn.cfg" \
    --trained_checkpoint_prefix="models/model.ckpt-100" \
    --output_directory="export"
```

  or you can use my frozen_inference_graph.pb
- Then use this updated tf_text_graph_ssd.py (I already applied the fix from #11560 (comment)) and execute this command:

```
python tf_text_graph_ssd.py \
    --input "export/frozen_inference_graph.pb" \
    --config "dnn.cfg" \
    --output "model.pbtxt"
```
- You will receive the stack trace shown above.