
Optimize LayerNormOp #13173

Closed

xiaomengy wants to merge 1 commit into pytorch:master from xiaomengy:export-D12398163

Conversation

@xiaomengy
Contributor

Summary: Optimize LayerNormOp

Differential Revision: D12398163

Member

@houseroad houseroad left a comment


LG

@xiaomengy
Contributor Author

Some benchmarks:
LayerNormGradientOp on CPU, Tensor Shape = [100, 100, 2000], axis = 1: 841.64ms -> 57.48ms.
LayerNormOp on GPU, Tensor Shape = [100, 1000, 2000], axis = 1: 1439.45ms -> 16.18ms.
LayerNormGradientOp on GPU, Tensor Shape = [100, 1000, 2000], axis = 1: 2148.43ms -> 22.27ms.
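For context on what the benchmarked op computes: a minimal NumPy sketch of layer normalization with Caffe2-style semantics, where all dimensions from `axis` onward are flattened and normalized together per outer row. This is an illustrative reference only, not the optimized kernel this PR introduces; the `layer_norm` helper name and `eps` default are assumptions for the sketch.

```python
import numpy as np

def layer_norm(x, axis=1, eps=1e-5):
    # Flatten dims from `axis` onward; each resulting row is
    # normalized to zero mean and unit variance independently.
    outer = int(np.prod(x.shape[:axis]))
    flat = x.reshape(outer, -1)
    mean = flat.mean(axis=1, keepdims=True)
    var = flat.var(axis=1, keepdims=True)
    normed = (flat - mean) / np.sqrt(var + eps)
    return normed.reshape(x.shape)

# Small analogue of the benchmarked shape [100, 1000, 2000], axis = 1.
x = np.random.randn(4, 8, 16).astype(np.float32)
y = layer_norm(x, axis=1)
```

With axis = 1 and the benchmarked shape, each of the 100 outer slices is normalized over 1000 * 2000 = 2,000,000 elements, which is why fusing the mean/variance reduction with the normalization pass pays off so heavily on GPU.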

@bddppq
Contributor

bddppq commented Oct 26, 2018

Summary:
Pull Request resolved: pytorch#13173

Optimize LayerNormOp

Reviewed By: houseroad

Differential Revision: D12398163

fbshipit-source-id: 96fae6efd2b3eeb5917c9400a270f69b1115f360
@xiaomengy xiaomengy deleted the export-D12398163 branch October 27, 2018 00:02
@ezyang ezyang added the merged label Jun 25, 2019
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
Summary:
Pull Request resolved: pytorch#13173

Optimize LayerNormOp

Reviewed By: houseroad

Differential Revision: D12398163

fbshipit-source-id: 6b76bc4bd9f34e623f8e385dd07d4ce99490badf
