[C++ API] Add OptimizerBase::add_parameters #9472
goldsborough wants to merge 3 commits into pytorch:master
Conversation
ebetica
left a comment
Note: one minor issue is that if any of the optimizers does significant work in its constructor, this might not work. Thus, add_parameters should definitely be virtual (and one of them should call the other so we don't have to implement two different add_parameter functions).
I think making them virtual makes sense. To make one call the other, I'll have to update
FWIW this is not supported by the Python API (we do have
@apaszke Are you saying that the Python API should have this functionality, or that the C++ API shouldn't? ;)
I didn't implement the whole parameter grouping functionality to begin with. It wasn't in autogradpp, and it seemed better to start with a single parameter group than to take on the added complexity of multiple parameter groups. Also, it can always be replicated with multiple optimizers, right? TF doesn't have this parameter grouping behavior in its optimizers, for example.
facebook-github-bot
left a comment
@goldsborough is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@ezyang I'm saying that they're inconsistent :) Yeah, multiple optimizers give you the same results, they're just more annoying to maintain. I guess it's OK to leave it like that for now.
@apaszke I guess the core issue is that state_dict() and load_state_dict() are not present. Implementing those would make me happy too. Just any way to load an optimizer's state.
Force-pushed from 56bebda to b2eec6a
facebook-github-bot
left a comment
@goldsborough has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Summary: ebetica asked for a way to add parameters to `Optimizer`s after they are created. @ebetica @ezyang
Pull Request resolved: pytorch#9472
Differential Revision: D8872176
Pulled By: goldsborough
fbshipit-source-id: 39a4032c519a6d3b458dd3596361b04afea10365