DNN: add ONNX where node support#23485
Conversation
template <typename T, typename Functor>
void trinary_forward(const Functor& f, const std::vector<Mat>& inputs, std::vector<Mat>& outputs)
To be honest, it looks like we could change the original function to accept multiple inputs, e.g. make it accept a (data + step) buffer. Everywhere there is an *i* suffix (e.g. ptr1_, etc.), we could use the buffer as well.
I think we had performance tests somewhere; we should check that this didn't degrade them. If it did, we could have 1) a fast binary path and 2) a slower n-ary path.
Hi @rogday, thanks for the code review.
I'm not very familiar with this part of the code. Could you give some more detailed examples so I can complete it?
@dkurt Could you please review the PR too?
TEST_P(Test_ONNX_layers, where_node)
{
    testONNXModels("where_layer");
}
Please add a test for a broadcasting scenario.
Hi @dkurt, thanks for your code review; I have updated the test case.
@zihaomu

Link to #23470 (comment)

Merge with test data: opencv/opencv_extra#1054
Pull Request Readiness Checklist
See details at https://github.com/opencv/opencv/wiki/How_to_contribute#making-a-good-pull-request
Patch to opencv_extra has the same branch name.