make tests deterministic and add TODO to fix state dict #49
Merged
andersonic merged 5 commits into master on Aug 21, 2020
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master      #49   +/-   ##
=======================================
  Coverage   94.18%   94.18%
=======================================
  Files          35       35
  Lines        2065     2065
=======================================
  Hits         1945     1945
  Misses        120      120
Flags with carried forward coverage won't be shown.
added 2 commits on August 21, 2020 at 09:21
…the cases that are known to be broken
Contributor (Author)
8/21 9:40 AM NOT READY FOR REVIEW
Contributor (Author)
8/21 10:10 AM READY
msbaines approved these changes on Aug 21, 2020
blefaudeux pushed a commit that referenced this pull request on Aug 21, 2020
Set the torch seed for tests. xfail mixed precision and memory-efficient mixed-precision state_dict tests due to their states being cast to FP16 and back to FP32 during load_state_dict.

Co-authored-by: Jun Ru Anderson <andersonic@fb.com>
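The two ideas in this commit message can be sketched in a short, hypothetical snippet (illustrative only, not taken from the project's test suite): seeding the RNG makes tensor draws reproducible, and a round-trip through FP16 is lossy, which is why a strict state_dict comparison after load_state_dict is expected to fail.

```python
import torch

# Seeding the global RNG makes tensor initialization reproducible,
# which is what the PR does for the optimizer tests.
torch.manual_seed(0)
a = torch.randn(4)
torch.manual_seed(0)
b = torch.randn(4)
assert torch.equal(a, b)  # identical draws after re-seeding

# Why the state_dict tests are xfail'ed: casting state to FP16 and
# back to FP32 is lossy, so the reloaded values cannot match
# bit-for-bit against the originals.
x = torch.tensor([0.1], dtype=torch.float32)
roundtrip = x.half().float()
assert not torch.equal(x, roundtrip)  # precision was lost in the cast
```

In a pytest suite, the second observation would typically be encoded with `pytest.mark.xfail` on the affected tests, as the commit message describes.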
blefaudeux added a commit that referenced this pull request on Aug 21, 2020
* initial commit, dummy training loop, pure pytorch but not DDP
* probably slightly broken, but rough DDP benchmark run
* adding the torchvision requirement for testing
* brainfart
* reduce the loss, do something slightly distributed
* Some cleanup, distributing the training on two GPUs
* some cleanup + adding a vanilla run, still not good to go
* less silly defaults, gtg for a start I think
* smaller batch to fit the smaller gpus used in the circleci rigs
* Adding some options for the benchmark, and regression testing
* [test] set torch seed for Adam tests (#49)
  Set the torch seed for tests. xfail mixed precision and memory-efficient mixed-precision state_dict tests due to their states being cast to FP16 and back to FP32 during load_state_dict.
  Co-authored-by: Jun Ru Anderson <andersonic@fb.com>
* linting, I really need to automate this isort insanity

Co-authored-by: Jun Ru Anderson <33384298+andersonic@users.noreply.github.com>
Co-authored-by: Jun Ru Anderson <andersonic@fb.com>
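The first bullet above mentions a "dummy training loop, pure pytorch but not DDP". A minimal sketch of such a loop (the model, data, and hyperparameters here are made up for illustration and are not from the benchmark code) could look like:

```python
import torch

# Tiny synthetic regression task: learn y = 2x from random inputs.
torch.manual_seed(0)
model = torch.nn.Linear(1, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

x = torch.randn(64, 1)
y = 2.0 * x

first_loss = None
for step in range(50):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    if first_loss is None:
        first_loss = loss.item()

# On this toy problem the loop should reduce the loss.
assert loss.item() < first_loss
```

Wrapping the model in `torch.nn.parallel.DistributedDataParallel` and launching one process per GPU is what turns a loop like this into the DDP benchmark the later bullets describe.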
myleott pushed a commit that referenced this pull request on Feb 22, 2021
* Test backward hooks are registered
* expand
* fs_test
* passing
* assert again
* add assert not called
* naming
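The "backward hooks are registered" idea in that commit can be illustrated with a small, hypothetical example (not the referenced test itself): register a hook on a tensor, assert it has not fired before `backward()`, then assert it fires exactly once with the expected gradient.

```python
import torch

calls = []

x = torch.ones(3, requires_grad=True)
# register_hook attaches a backward hook that receives x's gradient.
x.register_hook(lambda grad: calls.append(grad.clone()))

y = (x * 2).sum()
assert not calls            # hook must not fire before backward()
y.backward()
assert len(calls) == 1      # hook fired exactly once
# d(sum(2x))/dx = 2 for every element of x.
assert torch.equal(calls[0], torch.full((3,), 2.0))
```

The "add assert not called" bullet suggests the same pattern in reverse: verifying that a hook does not fire in configurations where it should not be registered.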
What does this PR do?
Fixes # (issue).

Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃