add selu activation function #10889
Conversation
    }
    else if (type == "Selu")
    {
        layerParams.set("scale", 1.0507009873554804934193349852946f);
Could you please provide the origin of these magic values? Are they the result of some optimization problem that we can reproduce locally, or are they just irrational numbers with a formula too complicated to inline? Either way, it would be nice to add a reference to a proof that they are optimal.
I have put a link to the paper in the linked issue. Here it is again: https://arxiv.org/abs/1706.02515
Sure, no problem.
Please look at custom layers registration: https://docs.opencv.org/master/dc/db1/tutorial_dnn_custom_layers.html.
PyTorch, Keras, and TensorFlow all have native support for SELU, which is strong evidence that this activation layer is popular enough for OpenCV to support. P.S. I would be grateful if someone could provide a custom layer implementation.
resolves #10888
This pull request adds the SELU activation function.
I have not yet added the test data and would appreciate guidance from the devs on how to do it.