Update Dropout and BatchNorm to be Training Friendly #2568
Merged
wschin merged 23 commits into onnx:master on Feb 6, 2020
Conversation
Contributor (Author):
@SherlockNoMad @wschin for review
wschin reviewed (multiple review passes, Jan 27–29, 2020)
added 3 commits on January 29, 2020 at 11:23
wschin reviewed (Jan 29, 2020)
Contributor (Author):
@gramalingam for review
gramalingam reviewed (Jan 30, 2020)
gramalingam approved these changes on Jan 31, 2020
This was referenced Feb 3, 2020
…a-hdr/onnx into lahaidar/update_training_ops
Contributor (Author):
@ebarsoum CI is green. Thanks!
facebook-github-bot pushed a commit to pytorch/pytorch that referenced this pull request on Mar 27, 2020
Summary:
- Update Dropout and Batchnorm in opset 12: onnx/onnx#2568
- Update API logic for exporting to ONNX training-amenable models
Pull Request resolved: #32950
Reviewed By: hl475
Differential Revision: D19710370
Pulled By: houseroad
fbshipit-source-id: e5e79d38552936966662c41d39ddf33be1ba3e35
lara-hdr added a commit to lara-hdr/pytorch that referenced this pull request on Mar 27, 2020
Summary:
- Update Dropout and Batchnorm in opset 12: onnx/onnx#2568
- Update API logic for exporting to ONNX training-amenable models
Pull Request resolved: pytorch#32950
Reviewed By: hl475
Differential Revision: D19710370
Pulled By: houseroad
fbshipit-source-id: e5e79d38552936966662c41d39ddf33be1ba3e35
This was referenced Apr 25, 2020
Closed
jcwchen pushed a commit to jcwchen/onnx that referenced this pull request on Sep 23, 2020
* Update Dropout and BatchNorm to be Training Friendly
* fix test name
* update ref implementation
* merge with master and re-generate docs
* fix eliminate dropout test
* missing type annotation
* update doc + shape inference
* update doc
* re-gen doc
* update doc
* update doc
* fix test
* add hasInputShape check
* rename outputs + update doc
* static_cast for stricter CI
Co-authored-by: Wei-Sheng Chin <wechi@microsoft.com>
Co-authored-by: Emad Barsoum <ebarsoum@gmail.com>
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request on Apr 24, 2026
Summary:
- Update Dropout and Batchnorm in opset 12: onnx/onnx#2568
- Update API logic for exporting to ONNX training-amenable models
Pull Request resolved: pytorch#32950
Reviewed By: hl475
Differential Revision: D19710370
Pulled By: houseroad
fbshipit-source-id: e5e79d38552936966662c41d39ddf33be1ba3e35
Initial PR: #1887
This PR updates Dropout and BatchNormalization to be training friendly.
For Dropout:
The "ratio" is now an input rather than an attribute of the operator, and a new attribute "seed" is introduced.
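The change can be illustrated with a pure-Python reference sketch of the described Dropout semantics: `ratio` arrives as a runtime value instead of a fixed attribute, `seed` makes training-mode sampling reproducible, and kept elements are scaled by `1 / (1 - ratio)` so the output's expected value matches the input. The function name and signature below are illustrative, not the ONNX runtime API.

```python
import random

def dropout(data, ratio=0.5, training_mode=False, seed=None):
    """Reference sketch of Dropout with ratio as a runtime input.

    In inference mode the op is the identity and the mask is all-True.
    In training mode each element is dropped with probability `ratio`,
    and kept elements are scaled by 1 / (1 - ratio).
    """
    if not training_mode or ratio == 0.0:
        return list(data), [True] * len(data)
    rng = random.Random(seed)  # `seed` gives reproducible masks
    scale = 1.0 / (1.0 - ratio)
    mask = [rng.random() >= ratio for _ in data]
    out = [x * scale if keep else 0.0 for x, keep in zip(data, mask)]
    return out, mask
```

Because `ratio` is now an input, a runtime can feed it a different value per step (e.g. for dropout-rate schedules) without rewriting the graph.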
For BatchNormalization:
The operator already has support for mean and var as inputs/outputs and saved_mean and saved_var as outputs. The input mean and var are the running values in training mode; otherwise they are the estimated values. The optional outputs mean/var/saved_mean/saved_var are only used in training mode.
A new optional input "training_mode" is introduced and defaults to False, since in most cases ONNX models are exported in inference mode; this is an input rather than an attribute because its value needs to change at runtime.
The new input "training_mode" allows stating explicitly whether the operator is in training or inference mode. However, the backend engine running the ONNX model could infer this information from the model itself, so we could potentially remove it entirely and let the engine decide how to compute the operator's output.
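The two BatchNormalization behaviors described above can be sketched for a single feature channel in pure Python. This is a hand-written reference, not the ONNX spec text: the momentum update convention (`running = momentum * running + (1 - momentum) * batch`) and the function name are assumptions for illustration.

```python
import math

def batch_norm_1d(x, scale, bias, running_mean, running_var,
                  training_mode=False, momentum=0.9, epsilon=1e-5):
    """Sketch of BatchNormalization for one feature channel.

    training_mode=True : normalize with the current batch statistics
    and return updated running statistics (the extra training outputs).
    training_mode=False: normalize with the stored running statistics.
    """
    if training_mode:
        n = len(x)
        mean = sum(x) / n
        var = sum((v - mean) ** 2 for v in x) / n
        # Assumed momentum convention; actual semantics may differ.
        running_mean = momentum * running_mean + (1.0 - momentum) * mean
        running_var = momentum * running_var + (1.0 - momentum) * var
    else:
        mean, var = running_mean, running_var
    inv_std = 1.0 / math.sqrt(var + epsilon)
    y = [(v - mean) * inv_std * scale + bias for v in x]
    return y, running_mean, running_var
```

This makes the design point concrete: the running statistics are both consumed and produced only when training_mode is set, while inference reduces to a fixed affine transform of the input.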