Add type annotations to torch.nn.modules.padding #49494
guilhermeleobas wants to merge 7 commits into pytorch:master from guilhermeleobas:padding
Conversation
💊 CI failures summary and remediations: as of commit 9c15796 (more details on the Dr. CI page).
This comment was automatically generated by Dr. CI and has been revised 46 times.
```diff
     """
-    padding: _size_2_t
+    padding: Tuple[int, int]
```
Before merging the change, please check that this code is still JIT compilable and that scalar-to-tuple promotions work as expected (_size_N_t are special types which get automatic promotion from integer to list).
Requesting changes just to ask that the PR description mention which test covers that torch.nn.modules.padding is JITable after the change.
Actually, in this particular class padding is always set by
self.padding = _pair(padding)
which always returns a tuple of 2.
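A minimal sketch of such a check, assuming a ConstantPad1d module and a made-up input shape (illustrative only, not the actual test added by this PR):

```python
import torch
import torch.nn as nn
from torch.nn.modules.utils import _pair

# _pair promotes a scalar to a 2-tuple, so the stored attribute is always a Tuple[int, int]
assert _pair(2) == (2, 2)

m = nn.ConstantPad1d(2, 3.5)            # scalar padding, promoted to (2, 2) in __init__
scripted = torch.jit.script(m)          # scripting would fail if the new annotation broke JIT
x = torch.randn(1, 2, 4)
assert torch.equal(scripted(x), m(x))   # scripted and eager outputs should agree
```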
One CI failure is unrelated.
facebook-github-bot
left a comment
@walterddr has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
malfet
left a comment
Please specify in a comment which test covers that the padding classes are torch.jit.scriptable after the change.
Do not merge this PR until one checks whether the annotations introduce any regressions. See:
walterddr
left a comment
looks good. let's wait for the test results
```python
m = Mod(nn.ConstantPad1d(2, 3.5))
inp = torch.randn(1, 2, 4)
```
Was wondering if we can create a list of Tuple[nn.Module, Tensor] to simplify this (a sketch follows below), but we can probably do the refactoring in another pass.
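A hedged sketch of what that refactor could look like (the list name `pad_modules_and_inputs` is hypothetical, and the test's `Mod` wrapper is omitted for brevity):

```python
import torch
import torch.nn as nn

# Hypothetical table of (padding module, example input) pairs
pad_modules_and_inputs = [
    (nn.ConstantPad1d(2, 3.5), torch.randn(1, 2, 4)),
    (nn.ReflectionPad2d(2), torch.arange(9, dtype=torch.float).reshape(1, 1, 3, 3)),
    (nn.ReplicationPad1d(2), torch.arange(8, dtype=torch.float).reshape(1, 2, 4)),
]

for module, inp in pad_modules_and_inputs:
    scripted = torch.jit.script(module)              # each padding class should still be scriptable
    assert torch.equal(scripted(inp), module(inp))   # and should match eager execution
```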
facebook-github-bot
left a comment
@walterddr has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
facebook-github-bot
left a comment
@walterddr has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
FYI on the Facebook-internal failure: not sure why the added padding test timed out internally. I am taking a look.
```python
(Mod(nn.ReflectionPad2d(2)), torch.arange(9, dtype=torch.float).reshape(1, 1, 3, 3)),
(Mod(nn.ReplicationPad1d(2)), torch.arange(8, dtype=torch.float).reshape(1, 2, 4)),
(Mod(nn.ReplicationPad2d(2)), torch.arange(9, dtype=torch.float).reshape(1, 1, 3, 3)),
(Mod(nn.ReplicationPad3d(3)), torch.randn(16, 3, 8, 320, 480)),
```
Just checked, and it seems this test takes significantly longer than the other TestJIT items.
Do we really need a 320x480 size for the 3d replication padding test for JIT, or can we use a smaller size?
FYI, I changed this to 32x48 and it passed the internal test without a timeout.
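A hedged sketch of the smaller case described above (the full shape 16x3x8x32x48 is an assumption based on the comment; the actual test wraps the module in `Mod`):

```python
import torch
import torch.nn as nn

m = nn.ReplicationPad3d(3)
inp = torch.randn(16, 3, 8, 32, 48)   # 32x48 spatial size instead of 320x480 to avoid the timeout
out = torch.jit.script(m)(inp)
# padding of 3 on every side grows the last three dimensions by 6 each
assert out.shape == (16, 3, 14, 38, 54)
```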
facebook-github-bot
left a comment
@walterddr has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Codecov Report
```
@@            Coverage Diff            @@
##           master   #49494     +/-   ##
==========================================
  Coverage   80.70%   80.71%
==========================================
  Files        1905     1905
  Lines      206813   206817       +4
==========================================
+ Hits       166916   166927      +11
+ Misses      39897    39890       -7
```
@walterddr merged this pull request in 374951d.
Summary: Closes pytorch gh-49492
Pull Request resolved: pytorch#49494
Reviewed By: mruberry
Differential Revision: D25723837
Pulled By: walterddr
fbshipit-source-id: 92af0100f6d9e2bb25b259f5a7fe9d449ffb6443
Closes gh-49492