Conversation
Signed-off-by: andife <fehlner@arcor.de>
Re: Yes, it would be very useful. Just to clarify: there is already a doc file here: https://github.com/onnx/onnx/blob/main/docs/AddNewOp.md ... if you have suggestions for what new stuff to add, that would be welcome, thanks!
You need to generate the test-case data and include it in the PR. E.g., see https://github.com/onnx/onnx/blob/main/tools/update_doc.sh for the steps to generate the updated files.
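For reference, node test data in the onnx repo is generated from a small Python test case. A minimal sketch of what such a case might look like (the `expect` helper lives in `onnx/backend/test/case/node`; the shapes and the test name here are illustrative, not the PR's actual code):

```python
import numpy as np
import onnx
from onnx.backend.test.case.node import expect  # helper used by the repo's node tests

node = onnx.helper.make_node("Mish", inputs=["X"], outputs=["Y"])
x = np.random.randn(3, 4, 5).astype(np.float32)
y = x * np.tanh(np.log1p(np.exp(x)))  # reference: mish(x) = x * tanh(softplus(x))
expect(node, inputs=[x], outputs=[y], name="test_mish")
```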
Signed-off-by: andife <fehlner@arcor.de>
I wonder why these were also updated. Do I have to commit them, too?
No need to commit other unrelated input.pb/output.pb. They were updated because different platforms might behave differently for input/output generation. Please just ignore them. Thanks! |
```cpp
.Output(0, "Y", "Output tensor", "T", OpSchema::Single, true, 1, OpSchema::Differentiable)
.TypeConstraint(
    "T",
    {"tensor(float16)", "tensor(float)", "tensor(double)"},
```
Wondering whether we should also include bfloat16 as an allowed type. Any reasons/concerns for or against this? Thanks!
Never mind. It looks like Softplus doesn't support bfloat16. Better to add bfloat16 support to both ops at the same time.
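As an aside, the types an op admits can be inspected from Python once the schema is registered; a minimal sketch using `onnx.defs` (it assumes Mish is registered in opset 18):

```python
import onnx.defs

schema = onnx.defs.get_schema("Mish", max_inclusive_version=18)
for tc in schema.type_constraints:
    # prints the type parameter and its allowed types,
    # e.g. T ['tensor(float16)', 'tensor(float)', 'tensor(double)']
    print(tc.type_param_str, list(tc.allowed_type_strs))
```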
```python
input_data = np.array([[[[0.8439683], [0.5665144], [0.05836735]],
                        [[0.02916367], [0.12964272], [0.5060197]],
                        [[0.79538304], [0.9411346], [0.9546573]]],
                       [[[0.17730942], [0.46192095], [0.26480448]],
                        [[0.6746842], [0.01665257], [0.62473077]],
                        [[0.9240844], [0.9722341], [0.11965699]]],
                       [[[0.41356155], [0.9129373], [0.59330076]],
                        [[0.81929934], [0.7862604], [0.11799799]],
                        [[0.69248444], [0.54119414], [0.07513223]]]], dtype=np.float32)
```
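For what it's worth, the expected output for this data can be derived directly with numpy (a sketch assuming the reference computation is x * tanh(softplus(x))):

```python
import numpy as np

expected = input_data * np.tanh(np.log1p(np.exp(input_data)))
```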
Some negative values could be included in the test, just in case. It's up to you; I don't think it's absolutely necessary.
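If it helps, sampling from a standard normal is an easy way to get both signs into the test data (a hypothetical one-liner; the shape mirrors the array above):

```python
input_data = np.random.randn(3, 3, 3, 1).astype(np.float32)  # randn yields both negative and positive values
```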
Signed-off-by: andife <fehlner@arcor.de>
* add mish as function
* Fix opset 17 -> 18
* gen doc
* #clang
* add negative values to test case
* fix flake
* #DC flake

Signed-off-by: andife <fehlner@arcor.de>
Description
Introduce the Mish activation function.
Motivation and Context
Mish is used in YoloV4.
Mish was requested here:
#3475 and #2818
It is implemented in
It can be calculated using the following formula:
mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))
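A small numpy sketch of the formula (my own illustration, not the PR's code; `np.logaddexp(0, x)` computes softplus without overflowing e^x for large inputs):

```python
import numpy as np

def mish(x: np.ndarray) -> np.ndarray:
    # mish(x) = x * tanh(softplus(x)), with softplus(x) = ln(1 + e^x)
    return x * np.tanh(np.logaddexp(0.0, x))

print(mish(np.array([-2.0, 0.0, 2.0], dtype=np.float32)))
# -> approximately [-0.2525  0.      1.944 ]
```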
Note
This is my first function contribution, and I've used other implementations as a guide. If you have any suggestions on how to improve the proposal or what I missed, please let me know.
I wonder if it would be useful to create a short piece of documentation with technical details, "How to implement a new operator or function": how to deal with defs.cc, how to create attributes, and so on.
Signed-off-by: andife <fehlner@arcor.de>