dnn: update network dump code, include ngraph serialization #17388
opencv-pushbot merged 1 commit into opencv:3.4
Conversation
```cpp
if (DNN_IE_SERIALIZE)
{
#ifndef OPENCV_DNN_DISABLE_NETWORK_AUTO_DUMP
```
Where is this defined? And what is the default value?
It is not defined anywhere in OpenCV, but it can be passed through build flags.
The purpose is to let custom builds turn off network dumps.
I just thought that AUTO_DUMP means networks are dumped by default, without flags. If it's a build flag, does that mean that if OPENCV_DNN_DISABLE_NETWORK_AUTO_DUMP is not specified but OPENCV_DNN_IE_SERIALIZE is ON, we cannot dump the network?
The naming is aligned with this code.
The idea is a compile-time blocker for dumping/serializing the network.
So, basically, my motivation is to have this option available in OpenCV from the OpenVINO distribution. Sometimes this method is helpful for enabling models in OpenVINO. Will this build flag be available in the OpenVINO build by default, or could we miss it?
This code is enabled by default, including OpenVINO builds.
Oh, I see now, sorry. #ifndef and DISABLE. Thanks!
```cpp
std::string dumpFileNameBase = netImpl_.getDumpFileNameBase();
try
{
    cnn.serialize(dumpFileNameBase + "_ngraph.xml", dumpFileNameBase + "_ngraph.bin");
```
BTW, dumping the network twice doesn't work (at least for YOLOv3 tests):

```cpp
cnn.serialize(dumpFileNameBase + "_ngraph.xml", dumpFileNameBase + "_ngraph.bin");
cnn.serialize(dumpFileNameBase + "_ngraph2.xml", dumpFileNameBase + "_ngraph2.bin");
```

The error message is:

```
Exception: Broken edge form layer Multiply_169520 to layer Divide_143502 during serialization of IR
```
Looks like a bug in the Inference Engine. Have you checked the accuracy after a single serialize? Maybe it even breaks inference.
Faced the following error when … Update: Oh, I see. This is a boolean output.
relates #17382