Native ONNX to Inference Engine backend#21066
Conversation
@alalek @andrewerf Is the PR still relevant? Do you plan to finish it?

@asmorkalov I just checked the background and the related issue, and it seems it is not coming from the Intel or OpenVINO end, so I can't help here. If the DNN module itself doesn't need this functionality, I believe the issue may be closed. cc: @TolyaTalamanov you may want to consider adding direct ONNX support in your OV 2.0 backend (FYI).
dmatveev
left a comment
No objections here, we may adopt the same way in G-API to load ONNX models directly into OV.
Overloaded `readNetFromONNX` to take a `backendId` parameter. Added static member function `Net::readFromONNX`. Added more overloads to provide a better interface for reading ONNX models with IE. Fixed the build without IR (`CV_UNUSED`).
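A minimal sketch of how the proposed overloads might be called. Note the PR was never merged, so `readNetFromONNX(path, backendId)` and `Net::readFromONNX` are taken from the commit message above, not from a released OpenCV API:

```cpp
#include <opencv2/dnn.hpp>

int main() {
    using namespace cv::dnn;
    // Proposed overload (hypothetical, from this PR): passing the backend id
    // hands the .onnx file to Inference Engine directly instead of going
    // through OpenCV's native ONNX importer.
    Net net = readNetFromONNX("model.onnx", DNN_BACKEND_INFERENCE_ENGINE);

    // Proposed static member alternative with the same effect:
    Net net2 = Net::readFromONNX("model.onnx", DNN_BACKEND_INFERENCE_ENGINE);
    return 0;
}
```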
Force-pushed from f6cd760 to 52234eb
As far as I understood from the OpenVINO doc (https://docs.openvino.ai/2022.3/classov_1_1Core.html), and based on our current implementation (https://github.com/opencv/opencv/blob/4.x/modules/gapi/include/opencv2/gapi/infer/ov.hpp#L138), the user can use
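The OpenVINO doc linked above describes `ov::Core::read_model`, which accepts an ONNX path directly (the `"model.onnx"` path below is a placeholder). A minimal sketch of loading an ONNX model through the OV 2.0 API, assuming an OpenVINO installation:

```cpp
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    // read_model dispatches on the file contents/extension, so an ONNX file
    // is imported through OpenVINO's ONNX frontend without a prior
    // Model Optimizer conversion to IR.
    std::shared_ptr<ov::Model> model = core.read_model("model.onnx");
    ov::CompiledModel compiled = core.compile_model(model, "CPU");
    return 0;
}
```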
opencv-alalek
left a comment
@dkurt Thank you for the update!
```diff
  */
  CV_EXPORTS_W
- Net readNetFromModelOptimizer(const String &xml, const String &bin);
+ Net readNetFromModelOptimizer(const String &xml, const String &bin = "");
```
Use `= String()` / `std::string()` to avoid processing unnecessary extra buffers.
(Other APIs have a similar problem, so it is better to resolve that in a separate PR.)
…onnx Native ONNX to Inference Engine backend opencv#21066

Resolves opencv#21052

### Pull Request Readiness Checklist

See details at https://github.com/opencv/opencv/wiki/How_to_contribute#making-a-good-pull-request

- [x] I agree to contribute to the project under Apache 2 License.
- [x] To the best of my knowledge, the proposed patch is not based on a code under GPL or other license that is incompatible with OpenCV
- [x] The PR is proposed to proper branch
- [x] There is reference to original bug report and related work
- [ ] There is accuracy test, performance test and test data in opencv_extra repository, if applicable
- [ ] The feature is well documented and sample code can be built with the project CMake