
[C++ API] A few additions #9837

Closed
ebetica wants to merge 9 commits into pytorch:master from ebetica:virtual_superclass

Conversation

@ebetica
Contributor

@ebetica ebetica commented Jul 25, 2018

This PR provides 4 fixes / features:

  1. `torch::nn::Cloneable` now inherits virtually from `torch::nn::Module`. We want to pass around a module with new functions, and the best way to do this is a diamond inheritance pattern, i.e.

```c++
struct MySuperModuleImpl : virtual public torch::nn::Module {
  virtual void myFunction() = 0;
};

template <typename Derived>
struct MySuperModule : public torch::nn::Cloneable<Derived>, public MySuperModuleImpl {};

struct MyModule : public MySuperModule<MyModule> {
  void myFunction() override;
};
```

This way, we can simply pass around `MySuperModuleImpl` instead of `torch::nn::Module`.

  2. Optimizer options are now public, since there is otherwise no way to decay the learning rate or modify it during training.
  3. Serialization functions create autograd history and call `copy_`! Bad!
  4. Optimizers did not create buffers after `add_parameters` was called.

@goldsborough
Contributor

test failures look legit though, something in serialization

@ebetica
Contributor Author

ebetica commented Jul 30, 2018

Hmm, the only thing I changed was that grad() returns a Variable, I'll try changing that back.

@weiyangfb weiyangfb added the "ready for review (this tag is deprecated)" label Jul 31, 2018
Contributor

@goldsborough goldsborough left a comment


LGTM, let's just get the tests working


@goldsborough
Contributor

@pytorchbot retest this please

Contributor

@facebook-github-bot facebook-github-bot left a comment


goldsborough has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

```c++
cputensor.data_ptr(),
cputensor.numel() * cputensor.type().elementSizeInBytes());
tensor.data().copy_(cputensor.data());
tensor.copy_(cputensor);
```


@ebetica
Contributor Author

ebetica commented Aug 7, 2018

Could you change the tensor.copy_ in load to use the data instead of the variable? @goldsborough

Contributor

@facebook-github-bot facebook-github-bot left a comment


goldsborough is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.



goodlux pushed a commit to goodlux/pytorch that referenced this pull request Aug 15, 2018
Summary:
This PR provides 4 fixes / features:

1. torch::nn::Cloneable inherits virtually from torch::nn::Module. We want to pass around a module with new functions, and the best way to do this is to do a diamond inheritance pattern, i.e.

```c++
struct MySuperModuleImpl : virtual public torch::nn::Module {
  virtual void myFunction() = 0;
};

template <typename Derived>
struct MySuperModule : public torch::nn::Cloneable<Derived>, public MySuperModuleImpl {};

struct MyModule : public MySuperModule<MyModule> {
  void myFunction() override;
};
```

This way, we can simply pass around MySuperModuleImpl instead of torch::nn::Module.

2. Optimizer options are now public, since there is otherwise no way to decay the learning rate or modify it during training.
3. Serialization functions create autograd history and call copy_! Bad!
4. Optimizers did not create buffers after add_parameters was called.
Pull Request resolved: pytorch#9837

Reviewed By: goldsborough

Differential Revision: D9199746

Pulled By: ebetica

fbshipit-source-id: 76d6b22e589a42637b7cc0b5bcd3c6b6662fb299
@ezyang ezyang added the merged label Jun 26, 2019