
add type annotations to torch.nn.modules.normalization #49035

Closed

guilhermeleobas wants to merge 6 commits into pytorch:master from guilhermeleobas:normalization

Conversation

@guilhermeleobas
Collaborator

Fixes #49034
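For context, the annotated constructor in this PR normalizes its `normalized_shape` argument to a tuple. A torch-free sketch of that logic (here `Size` is a stand-in for `torch.Size`, and `normalize_shape` is an illustrative helper, not a function in the module):

```python
from typing import List, Tuple, Union


class Size(tuple):
    """Stand-in for torch.Size, which is also a tuple subclass."""


# Shape alias as introduced by the PR.
_shape_t = Union[int, List[int], Size]


def normalize_shape(normalized_shape: _shape_t) -> Tuple[int, ...]:
    """Mirror of the LayerNorm constructor logic: a bare int becomes a
    1-tuple, and any other shape-like value is coerced to a tuple."""
    if isinstance(normalized_shape, int):
        return (normalized_shape,)
    return tuple(normalized_shape)


print(normalize_shape(10))            # (10,)
print(normalize_shape([3, 4]))        # (3, 4)
print(normalize_shape(Size((2, 5))))  # (2, 5)
```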

@guilhermeleobas guilhermeleobas added the module: typing Related to mypy type annotations label Dec 8, 2020
@guilhermeleobas guilhermeleobas self-assigned this Dec 8, 2020
@codecov

codecov Bot commented Dec 8, 2020

Codecov Report

Merging #49035 (59aba29) into master (e29082b) will decrease coverage by 0.00%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master   #49035      +/-   ##
==========================================
- Coverage   80.71%   80.70%   -0.01%     
==========================================
  Files        1904     1904              
  Lines      206598   206598              
==========================================
- Hits       166750   166742       -8     
- Misses      39848    39856       +8     

@guilhermeleobas guilhermeleobas marked this pull request as ready for review December 9, 2020 14:51
@H-Huang H-Huang added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label Dec 10, 2020
Collaborator

@rgommers rgommers left a comment


Thanks @guilhermeleobas. Just a couple of minor comments.

Comment thread torch/nn/modules/normalization.py Outdated
Comment thread torch/nn/modules/normalization.py Outdated
Collaborator

@rgommers rgommers left a comment


LGTM now, thanks @guilhermeleobas

@rgommers rgommers requested review from ezyang and malfet December 19, 2020 13:15
        input, self.normalized_shape, self.weight, self.bias, self.eps)

-    def extra_repr(self) -> Tensor:
+    def extra_repr(self) -> str:
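The corrected annotation is the right one because `extra_repr` feeds directly into the module's printed representation, so it must return a string, not a Tensor. A minimal torch-free sketch of that contract (the class names here are illustrative, and `__repr__` is a simplified version of what `torch.nn.Module` actually does):

```python
class ModuleSketch:
    """Minimal stand-in for nn.Module's repr machinery."""

    def extra_repr(self) -> str:
        return ""

    def __repr__(self) -> str:
        # Embeds extra_repr() directly into the printed description,
        # which is why the return type must be str.
        return f"{type(self).__name__}({self.extra_repr()})"


class LayerNormSketch(ModuleSketch):
    def __init__(self, normalized_shape, eps: float = 1e-5) -> None:
        self.normalized_shape = normalized_shape
        self.eps = eps

    def extra_repr(self) -> str:
        # Returning a Tensor here would be wrong: __repr__ expects to
        # interpolate a plain string.
        return f"{self.normalized_shape}, eps={self.eps}"


print(repr(LayerNormSketch((10,))))  # LayerNormSketch((10,), eps=1e-05)
```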
Contributor


Wonder why it was not caught before.

Comment thread torch/nn/modules/normalization.py Outdated


-_shape_t = Union[int, List[int], Size]
+_shape_t = Union[int, Sequence[int], Size]
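The difference between the two aliases shows up at type-check time, not at runtime: `List[int]` rejects tuples and other sequences, while `Sequence[int]` accepts any sequence type. A torch-free sketch of the distinction (`describe_shape` is an illustrative helper, not part of the module):

```python
from typing import Sequence, Union


def describe_shape(shape: Union[int, Sequence[int]]) -> str:
    """Runtime behavior is identical for lists and tuples; only the
    static annotation differs. Under List[int], passing a tuple such
    as (3, 4) is a mypy error; under Sequence[int] it type-checks,
    because tuple implements the Sequence protocol."""
    if isinstance(shape, int):
        return f"scalar shape {shape}"
    return f"shape {tuple(shape)}"


print(describe_shape([3, 4]))  # shape (3, 4)
print(describe_shape((3, 4)))  # shape (3, 4)
print(describe_shape(7))       # scalar shape 7
```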
Contributor


Do we have a test to validate JIT-ability? Since we know Sequence & Tuple don't work with JIT.

Collaborator Author


Actually, this is not necessary. I am reverting this particular change.

@guilhermeleobas
Collaborator Author

Do not merge this PR until someone checks whether the annotations introduce any regression. See:
#49564 (comment)

Contributor

@facebook-github-bot facebook-github-bot left a comment


@walterddr has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@guilhermeleobas guilhermeleobas marked this pull request as ready for review January 8, 2021 15:49
@guilhermeleobas
Collaborator Author

Tests are covered by the generated test cases in TestJitGeneratedModule. See:

test/test_jit.py::TestJitGeneratedModule::test_nn_LayerNorm_1d_elementwise_affine
test/test_jit.py::TestJitGeneratedModule::test_nn_LayerNorm_1d_empty_elementwise_affine
test/test_jit.py::TestJitGeneratedModule::test_nn_LayerNorm_1d_no_elementwise_affine
test/test_jit.py::TestJitGeneratedModule::test_nn_LayerNorm_3d_elementwise_affine
test/test_jit.py::TestJitGeneratedModule::test_nn_LayerNorm_3d_no_elementwise_affine

Contributor

@walterddr walterddr left a comment


Looks good; one minor comment.

Also, could you confirm that the LayerNorm1d constructor arguments in torch.testing._internal.common_nn actually include non-tuple inputs? From what I see, all of those constructor_args are lists; I'm not sure whether they collapse into a single integer.

Comment on lines +152 to +153
normalized_shape = (normalized_shape,) # type: ignore[assignment]
self.normalized_shape = tuple(normalized_shape) # type: ignore[arg-type]
Contributor


normalized_

Suggested change
-normalized_shape = (normalized_shape,) # type: ignore[assignment]
-self.normalized_shape = tuple(normalized_shape) # type: ignore[arg-type]
+normalized_shape_ = (normalized_shape,)
+self.normalized_shape = tuple(normalized_shape_)

Contributor


Also wondering: do we even need the second tuple() wrapper?

Collaborator Author


I guess the second tuple() is to make sure self.normalized_shape is a tuple and not a list, for instance.

Collaborator Author


Also, I do not think the suggested change works as expected. If the input arg normalized_shape is a list, then the if block will never be executed and normalized_shape_ will not be defined.
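That failure mode is easy to demonstrate with a torch-free sketch of the two variants (the function names here are illustrative, extracted from the constructor for clarity):

```python
from typing import List, Tuple, Union


def build_normalized_shape(normalized_shape: Union[int, List[int]]) -> Tuple[int, ...]:
    """Mirrors the merged code: rebind inside the if, then coerce."""
    if isinstance(normalized_shape, int):
        normalized_shape = (normalized_shape,)  # type: ignore[assignment]
    # The outer tuple() still matters: for a list input the branch is
    # skipped, and without it the attribute would remain a list.
    return tuple(normalized_shape)  # type: ignore[arg-type]


def build_suggested(normalized_shape: Union[int, List[int]]) -> Tuple[int, ...]:
    """Mirrors the suggested change: for a list input the if branch
    never runs, so normalized_shape_ is unbound at the return."""
    if isinstance(normalized_shape, int):
        normalized_shape_ = (normalized_shape,)
    return tuple(normalized_shape_)


print(build_normalized_shape(10))      # (10,)
print(build_normalized_shape([3, 4]))  # (3, 4)
try:
    build_suggested([3, 4])
except UnboundLocalError:
    print("suggested version fails on list input")
```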

Contributor

@facebook-github-bot facebook-github-bot left a comment


@walterddr has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@walterddr merged this pull request in 4411b5a.

laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
Summary:
Fixes pytorch#49034

Pull Request resolved: pytorch#49035

Test Plan:
Imported from GitHub, without a `Test Plan:` line.
Force rebased to deal with merge conflicts

Reviewed By: zhangguanheng66

Differential Revision: D25767065

Pulled By: walterddr

fbshipit-source-id: ffb904e449f137825824e3f43f3775a55e9b011b

Labels

cla signed Merged module: typing Related to mypy type annotations open source triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Enable torch.nn.modules.normalization typechecks during CI

7 participants