add type annotations to torch.nn.modules.container #48969
guilhermeleobas wants to merge 7 commits into pytorch:master from guilhermeleobas:container
Conversation
💊 CI failures summary and remediations
As of commit 3a5247e (more details on the Dr. CI page): 💚 Looks good so far! There are no failures yet. 💚
This comment was automatically generated by Dr. CI.
rgommers
left a comment
LGTM, thanks @guilhermeleobas. Just adding a couple of comments would be nice.
Hi @rgommers, done!
Codecov Report
```
@@            Coverage Diff             @@
##           master   #48969     +/-   ##
==========================================
- Coverage   80.60%   80.60%   -0.01%
==========================================
  Files        1879     1879
  Lines      202892   202894      +2
==========================================
  Hits       163543   163543
- Misses      39349    39351      +2
```
walterddr
left a comment
looks good. thanks for the contribution!
facebook-github-bot
left a comment
@walterddr has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Do not merge this PR until someone checks whether the annotations introduce any regression. See:
walterddr
left a comment
looks like the only thing we need to add as additional tests is:

```python
model = ModuleList(...)
model += nn.Conv1d()
model.append(nn.Conv2d())
model.extend([nn.Conv1d(), nn.Conv2d()])
```

They don't have to be convolution modules; we just need to validate that the `__iadd__`, `append`, and `extend` APIs are OK with JIT.
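A minimal sketch of the kind of test being asked for here, assuming it just needs to show that a `ModuleList` mutated via `__iadd__`, `append`, and `extend` still scripts and runs. The `nn.Linear` layers and the `MyModel` name are placeholders, not the modules used in the actual PR tests:

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Build the list with all three mutation APIs under test.
        self.layers = nn.ModuleList([nn.Linear(10, 10)])
        self.layers += [nn.Linear(10, 10)]        # __iadd__
        self.layers.append(nn.Linear(10, 10))     # append
        self.layers.extend([nn.Linear(10, 10)])   # extend

    def forward(self, x):
        # Iterating a ModuleList is supported inside TorchScript.
        for layer in self.layers:
            x = layer(x)
        return x

# Scripting should succeed and the scripted module should run.
scripted = torch.jit.script(MyModel())
out = scripted(torch.randn(2, 10))
assert out.shape == (2, 10)
```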
Done! @walterddr Should I include tests for
Yes, good catch. For
I am having some issues adding:

```python
class Param(nn.Module):
    def __init__(self):
        super(Param, self).__init__()
        self.params = nn.ParameterList([nn.Parameter(torch.randn(10, 10)) for i in range(10)])

    def forward(self, x):
        # ParameterList can act as an iterable, or be indexed using ints
        sz = len(self.params)
        for i in range(sz):
            p = self.params[i]
            x = self.params[i // 2].mm(x) + p.mm(x)
        return x

m = torch.jit.script(Param())
```

And the error:

```
def create_methods_and_properties_from_stubs(concrete_type, method_stubs, property_stubs):
    method_defs = [m.def_ for m in method_stubs]
    method_rcbs = [m.resolution_callback for m in method_stubs]
    method_defaults = [get_default_args(m.original_method) for m in method_stubs]
    property_defs = [p.def_ for p in property_stubs]
    property_rcbs = [p.resolution_callback for p in property_stubs]
>   concrete_type._create_methods_and_properties(property_defs, property_rcbs, method_defs, method_rcbs, method_defaults)
E   RuntimeError:
E   Tried to access nonexistent attribute or method '__len__' of type '__torch__.torch.nn.modules.container.ParameterList'. Did you forget to initialize an attribute in __init__()?:
E   File "/home/guilhermel/git/pytorch/test/test_jit.py", line 848
E       # ParameterList can act as an iterable, or be indexed using ints
E       sz = len(self.params)
E            ~~~ <--- HERE
E       for i in range(sz):
E           p = self.params[i]
```

And iterating directly in forward:

```
def create_methods_and_properties_from_stubs(concrete_type, method_stubs, property_stubs):
    method_defs = [m.def_ for m in method_stubs]
    method_rcbs = [m.resolution_callback for m in method_stubs]
    method_defaults = [get_default_args(m.original_method) for m in method_stubs]
    property_defs = [p.def_ for p in property_stubs]
    property_rcbs = [p.resolution_callback for p in property_stubs]
>   concrete_type._create_methods_and_properties(property_defs, property_rcbs, method_defs, method_rcbs, method_defaults)
E   RuntimeError:
E   Only constant Sequential, ModueList, or ModuleDict can be used as an iterable:
E   File "/home/guilhermel/git/pytorch/test/test_jit.py", line 847
E       def forward(self, x):
E           # ParameterList can act as an iterable, or be indexed using ints
E           for p in self.params:
E           ~~~~~~~~~~~~~~~~~~~~~
E               x = p.mm(x)
E               ~~~~~~~~~~ <--- HERE
```
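For contrast with the TorchScript failures above, the same `ParameterList` pattern works in eager mode; a minimal sanity check (shapes and loop count are illustrative, not from the PR's tests):

```python
import torch
import torch.nn as nn

# Eager-mode check: len() and integer indexing on ParameterList
# work fine outside torch.jit.script.
params = nn.ParameterList([nn.Parameter(torch.randn(10, 10)) for _ in range(4)])

x = torch.randn(10, 3)
for i in range(len(params)):
    # (10, 10) @ (10, 3) -> (10, 3) at every step
    x = params[i].mm(x)

assert x.shape == (10, 3)
```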
|
The last commit introduces tests for
that should be fine considering the annotations on ParameterList are essentially the same as the ones on ModuleList
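The annotation pattern shared by these containers can be illustrated with a torch-free toy: `__getitem__` is overloaded so that an `int` index returns one item while a `slice` returns a new container, alongside annotated `__len__`, `__iter__`, and `__iadd__`. `TinyList` and its `str` items are invented for the sketch:

```python
from typing import Iterable, Iterator, Union, overload

class TinyList:
    """Toy container mimicking the ModuleList annotation pattern."""

    def __init__(self, items: Iterable[str] = ()) -> None:
        self._items = list(items)

    # Static overloads: int -> item, slice -> new container.
    @overload
    def __getitem__(self, idx: int) -> str: ...
    @overload
    def __getitem__(self, idx: slice) -> "TinyList": ...

    def __getitem__(self, idx: Union[int, slice]):
        if isinstance(idx, slice):
            return TinyList(self._items[idx])
        return self._items[idx]

    def __len__(self) -> int:
        return len(self._items)

    def __iter__(self) -> Iterator[str]:
        return iter(self._items)

    def __iadd__(self, other: Iterable[str]) -> "TinyList":
        self._items.extend(other)
        return self

tl = TinyList(["a", "b"])
tl += ["c"]                         # __iadd__
assert len(tl) == 3                 # __len__
assert tl[0] == "a"                 # int indexing -> item
assert isinstance(tl[0:2], TinyList)  # slicing -> container
```

With these annotations, a type checker knows `tl[0]` is an item while `tl[0:2]` is another container, which is exactly the distinction the PR encodes for the torch containers.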
facebook-github-bot
left a comment
There was a problem hiding this comment.
@walterddr has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
test
I had committed some changes by mistake in the

```
================================================================== test session starts ==================================================================
platform linux -- Python 3.9.1, pytest-6.2.1, py-1.10.0, pluggy-0.13.1 -- /home/guilhermel/.conda/envs/pytorch-cuda-dev/bin/python
cachedir: .pytest_cache
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/guilhermel/pytorch/.hypothesis/examples')
rootdir: /home/guilhermel/pytorch
plugins: hypothesis-6.0.0
collected 2054 items / 2051 deselected / 3 selected

test/test_nn.py::TestNN::test_ParameterList PASSED                    [ 33%]
test/test_nn.py::TestNN::test_parameterlistdict_pickle PASSED         [ 66%]
test/test_nn.py::TestNN::test_parameterlistdict_setting_attributes PASSED [100%]
```

Tests for
facebook-github-bot
left a comment
@walterddr has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
rebased to try to mitigate the test failure
Looks like CI is all green
CI is not working; only GitHub hooks were triggered.
Fixed the CI issue in my last commit
facebook-github-bot
left a comment
@walterddr has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@walterddr merged this pull request in a9e46f1. |
Summary: Fixes pytorch#48968 Pull Request resolved: pytorch#48969 Reviewed By: mrshenli Differential Revision: D25728987 Pulled By: walterddr fbshipit-source-id: 02c3aa2078f4ed6cc6edd90ffe1177d789c328a9