
Added check and test for betas parameter in Adam optimizer#5147

Merged
soumith merged 2 commits into pytorch:master from lazypanda1:adam-optimizer-fix
Feb 12, 2018

Conversation

@lazypanda1
Contributor

This PR adds a check to the Adam optimizer to prevent division-by-zero errors and give users a friendlier error message.

Currently, if the first beta value of the Adam optimizer is specified as 1.0, the program fails with ZeroDivisionError: float division by zero. According to the definition of Adam, beta values should be in the range [0, 1).

Also referencing #751, where I first encountered and tried to fix this issue. I believe this would benefit all clients using PyTorch as a backend.
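The kind of validation the PR describes can be sketched as a small standalone check. This is an illustrative sketch, not PyTorch's exact code; the helper name `validate_betas` is hypothetical, but the error message matches the one quoted in the test excerpt below.

```python
# Hedged sketch of a betas-range check for an Adam-style optimizer:
# each beta must lie in [0, 1), otherwise raise a descriptive ValueError
# instead of letting a later division by zero crash the program.
def validate_betas(betas):
    for i, beta in enumerate(betas):
        if not 0.0 <= beta < 1.0:
            raise ValueError(
                "Invalid beta parameter at index {}: {}".format(i, beta))

validate_betas((0.9, 0.999))    # default Adam betas: valid, no exception
try:
    validate_betas((1.0, 0.0))  # beta1 == 1.0 is out of range
except ValueError as e:
    print(e)  # Invalid beta parameter at index 0: 1.0
```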

Comment thread test/test_optim.py Outdated
torch.randn(10),
torch.randn(5),
constructor
)


Comment thread test/test_optim.py Outdated
lambda weight, bias: optim.Adam(
[weight, bias], lr=1e-2, betas=(1.0, 0.0), amsgrad=True),
"Invalid beta parameter at index 0: 1.0"
)

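For context on why the test above expects an error: with beta1 = 1.0, Adam's bias-correction term 1 - beta1**t is zero at every step, and the update divides by it. A simplified single-parameter sketch of the failure mode (not PyTorch's implementation):

```python
# Simplified scalar Adam step showing where beta1 == 1.0 breaks:
# the bias-correction denominator 1 - beta1**step becomes exactly zero.
def adam_step(param, grad, m, v, step, lr=1e-2, betas=(0.9, 0.999), eps=1e-8):
    beta1, beta2 = betas
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    bias_correction1 = 1 - beta1 ** step        # zero when beta1 == 1.0
    bias_correction2 = 1 - beta2 ** step
    m_hat = m / bias_correction1                # ZeroDivisionError here
    v_hat = v / bias_correction2
    return param - lr * m_hat / (v_hat ** 0.5 + eps), m, v

# Works with the default betas; raises ZeroDivisionError with betas=(1.0, 0.0).
p, m, v = adam_step(1.0, 0.5, 0.0, 0.0, step=1)
```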

@lazypanda1
Contributor Author

@apaszke I have reduced the test. Please review.

@apaszke
Contributor

apaszke commented Feb 11, 2018

@pytorchbot test this please

@soumith merged commit a061000 into pytorch:master Feb 12, 2018
@soumith
Collaborator

soumith commented Feb 12, 2018

thank you @lazypanda1 !

@lazypanda1 deleted the adam-optimizer-fix branch March 27, 2018 00:06
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026

* Added check and test for betas parameter in Adam optimizer

* Simplified test
