SwAV #239
Conversation
Hello @ananyahjha93! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻
Comment last updated at 2020-10-19 16:09:54 UTC
Codecov Report

```
@@            Coverage Diff             @@
##           master     #239      +/-   ##
==========================================
- Coverage   83.91%   82.06%   -1.86%
==========================================
  Files          91       97       +6
  Lines        4861     5441     +580
==========================================
+ Hits         4079     4465     +386
- Misses        782      976     +194
```

Flags with carried forward coverage won't be shown.
Continue to review full report at Codecov.
```python
if prob < self.p:
    sigma = (self.max - self.min) * np.random.random_sample() + self.min
    sample = cv2.GaussianBlur(sample, (self.kernel_size, self.kernel_size), sigma)
```
```python
trainer = pl.Trainer(
    gpus=0, fast_dev_run=False, max_epochs=1, default_root_dir=tmpdir, max_steps=3
)
```

Can we at least do a `fast_dev_run` to test?
```python
class GaussianBlur(object):
    # Implements Gaussian blur as described in the SimCLR paper
    def __init__(self, kernel_size, p=0.5, min=0.1, max=2.0):
```

Might not want to use `min` and `max` since these are Python built-ins. Could lead to potential issues.
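One way to address the shadowing is simply to rename the parameters. This is a sketch with hypothetical `sigma_min`/`sigma_max` names (not necessarily the PR's final choice); the sigma sampling matches the snippet above:

```python
import numpy as np

class GaussianBlur:
    """SimCLR-style Gaussian blur transform; sigma_min/sigma_max
    avoid shadowing the built-ins min() and max()."""

    def __init__(self, kernel_size, p=0.5, sigma_min=0.1, sigma_max=2.0):
        self.kernel_size = kernel_size
        self.p = p
        self.sigma_min = sigma_min
        self.sigma_max = sigma_max

    def sample_sigma(self):
        # Uniform sigma in [sigma_min, sigma_max), as in the original snippet.
        return (self.sigma_max - self.sigma_min) * np.random.random_sample() + self.sigma_min
```

With the rename, `min(...)` and `max(...)` remain usable inside the class body, which is exactly the issue the reviewer is flagging.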
```diff
@@ -1,2 +1,2 @@
 pytorch-lightning>=1.0
```

reverse order

```
torch>=1.6
pytorch-lightning>=1.0
```
Before submitting

What does this PR do?
Adapts SwAV from the official implementation.

PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?
Make sure you had fun coding 🙃