Added parameter range checks for all optimizers #6000

Merged

apaszke merged 4 commits into pytorch:master on Mar 28, 2018
Conversation
vishwakftw reviewed Mar 26, 2018
    if not 0.0 <= lr:
        raise ValueError("Invalid learning rate: {}".format(lr))
    if not 0.0 <= momentum <= 1.0:
        raise ValueError("Invalid momentum value: {}".format(momentum))
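The snippet above is the pattern the PR applies at optimizer construction time: validate each hyperparameter and fail fast with a ValueError. A minimal self-contained sketch of that pattern (a hypothetical standalone checker mirroring the quoted diff, not the actual torch.optim code):

```python
# Hypothetical validator mirroring the SGD checks quoted in the diff.
# Out-of-range values raise ValueError immediately, instead of
# silently corrupting the training run later.
def check_sgd_params(lr, momentum):
    if not 0.0 <= lr:
        raise ValueError("Invalid learning rate: {}".format(lr))
    if not 0.0 <= momentum <= 1.0:
        raise ValueError("Invalid momentum value: {}".format(momentum))


check_sgd_params(lr=1e-2, momentum=0.9)  # valid: passes silently

try:
    check_sgd_params(lr=-1.0, momentum=0.9)  # negative lr is rejected
except ValueError as e:
    print("rejected:", e)
```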
vishwakftw approved these changes Mar 26, 2018
vishwakftw (Contributor) left a comment:

I have reviewed the rest of the ranges. Looks good to me.
Contributor:

@pytorchbot test this please
apaszke approved these changes Mar 26, 2018
Contributor:

@pytorchbot test this please
lazypanda1 commented Mar 26, 2018
    optim.Adagrad(None, lr=1e-2, lr_decay=-0.5)

    def test_adagrad_sparse(self):
        self._test_rosenbrock_sparse(
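The quoted test constructs an Adagrad optimizer with an out-of-range lr_decay. A self-contained sketch of that style of negative test, using a hypothetical stand-in for optim.Adagrad's constructor-time validation (not the real PyTorch code):

```python
# `fake_adagrad` is a hypothetical stand-in for the validation that
# optim.Adagrad performs in its constructor after this PR; it is not
# the actual PyTorch implementation.
def fake_adagrad(params, lr=1e-2, lr_decay=0.0):
    if not 0.0 <= lr:
        raise ValueError("Invalid learning rate: {}".format(lr))
    if not 0.0 <= lr_decay:
        raise ValueError("Invalid lr_decay value: {}".format(lr_decay))
    return {"params": params, "lr": lr, "lr_decay": lr_decay}


# A negative lr_decay should fail fast at construction time,
# which is exactly what a range-check test asserts.
try:
    fake_adagrad(None, lr=1e-2, lr_decay=-0.5)
except ValueError as e:
    print("rejected:", e)
```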
Contributor:

@pytorchbot test this please
Contributor (Author):

Can this be merged?
apaszke approved these changes Mar 28, 2018
Contributor:

Thank you @lazypanda1!
This PR adds parameter range checks to all optimizers, so that end users who pass invalid hyperparameter values get a clear error instead of being confused by bad output when there is no actual problem with their model.

For example, running the following program produces NaNs in the output due to an invalid value of rho (> 1.0). Output:

I tried adding constraints for all the parameters that I could infer from the corresponding papers, but I am still missing some. Please feel free to suggest bounds for the ones that are missing.

This is similar to the bounds check I previously added for the Adam optimizer. I can also add tests if needed.
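The rho > 1.0 failure mode mentioned above can be illustrated with a toy plain-Python version of the Adadelta accumulator (an assumption for illustration, not the actual torch.optim.Adadelta code). Adadelta maintains a running average of squared gradients, E[g^2]_t = rho * E[g^2]_{t-1} + (1 - rho) * g_t^2; with rho > 1.0 the (1 - rho) term is negative, the accumulator can go negative, and the square root taken in the update then yields NaN in floating-point tensor math:

```python
# Toy Adadelta-style accumulator (illustrative sketch, not the real
# torch.optim.Adadelta implementation).
def sq_avg_after(rho, grads):
    """Running average of squared gradients after applying each grad."""
    acc = 0.0
    for g in grads:
        acc = rho * acc + (1.0 - rho) * g * g
    return acc


good = sq_avg_after(0.9, [1.0, 1.0, 1.0])  # 0.271: stays non-negative
bad = sq_avg_after(1.5, [1.0, 1.0, 1.0])   # -2.375: goes negative,
                                           # so sqrt(acc) would be NaN
print(good, bad)
```

This is why a simple constructor-time check like `if not 0.0 <= rho <= 1.0: raise ValueError(...)` surfaces the mistake immediately rather than as mysterious NaNs mid-training.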