Add complex support for torch.nn.L1Loss#46640

Closed
anjali411 wants to merge 1 commit into gh/anjali411/68/base from gh/anjali411/68/head

Conversation

@anjali411
Contributor

@anjali411 anjali411 commented Oct 21, 2020

Stack from ghstack:

TODO:

  1. update l1_loss_backward
  2. possibly update doc
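
For background on what the TODOs target: L1 loss on complex inputs takes the complex modulus of each elementwise difference, so the loss itself stays real-valued. A minimal pure-Python sketch of these semantics (the list-based `l1_loss` helper below is illustrative only, not PyTorch's API):

```python
def l1_loss(inputs, targets, reduction="mean"):
    # abs() on a complex number is its modulus |a + bj| = sqrt(a^2 + b^2),
    # so every elementwise loss, and hence the reduction, is real-valued
    diffs = [abs(x - y) for x, y in zip(inputs, targets)]
    if reduction == "none":
        return diffs
    total = sum(diffs)
    return total / len(diffs) if reduction == "mean" else total

# |2j| = 2 and |3| = 3, so the mean is 2.5
print(l1_loss([1 + 2j, 3 - 1j], [1 + 0j, 0 - 1j]))
```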

anjali411 added a commit that referenced this pull request Oct 21, 2020
ghstack-source-id: d3a1d45
Pull Request resolved: #46640
@anjali411 anjali411 added the module: nn (Related to torch.nn) and module: complex (Related to complex number support in PyTorch) labels Oct 21, 2020
@anjali411 anjali411 requested a review from mruberry October 21, 2020 05:19
@dr-ci

dr-ci Bot commented Oct 21, 2020

💊 CI failures summary and remediations

As of commit 326a8d7 (more details on the Dr. CI page):


Commit 326a8d7 was recently pushed. Waiting for builds...


This comment was automatically generated by Dr. CI.

```cpp
Tensor& l1_loss_out(Tensor& result, const Tensor& input, const Tensor& target, int64_t reduction) {
  auto diff = input.sub(target);
```
Collaborator


This is interesting. In the previous code, `input - target` is out of place if `Reduction::None`, but in place otherwise. With this change it will always be out of place.

I'm not sure why that distinction existed previously, but it seems like this change could preserve the in-place subtraction in the else clause?
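
The distinction being discussed can be sketched in pure Python (illustrative only, not the ATen code): with `Reduction::None` the elementwise losses are themselves the result and must be materialized out of place, whereas under a reduction the difference buffer is just an intermediate that can be overwritten before reducing.

```python
def l1_loss_ref(inputs, targets, reduction="none"):
    if reduction == "none":
        # out of place: the elementwise losses are the returned result
        return [abs(x - y) for x, y in zip(inputs, targets)]
    # here the difference buffer is only an intermediate, so it can be
    # overwritten "in place" before reducing (mirroring the old else branch)
    buf = [x - y for x, y in zip(inputs, targets)]
    for i, d in enumerate(buf):
        buf[i] = abs(d)
    total = sum(buf)
    return total / len(buf) if reduction == "mean" else total
```

One plausible reason for going out of place unconditionally (speculation, not stated in the PR): for complex inputs the modulus is real-valued while the difference buffer is complex, so an in-place `abs_` on that buffer would have to change its dtype.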

Comment thread test/test_nn.py
```diff
@@ -7056,11 +7056,12 @@ def test_pointwise_loss_broadcast(self):
     # https://github.com/pytorch/pytorch/issues/27692 reports
     # that l1_loss get a wrong result for big batch size
     def test_l1_loss_correct(self):
```
Collaborator


Is there no other L1 loss test that should be updated?

This is making me think we should (eventually) create test_losses.py.

facebook-github-bot pushed a commit that referenced this pull request Jan 15, 2021
Summary:
Building on top of the work of anjali411 (#46640)

Things added in this PR:
1. Modify backward and double-backward formulas
2. Add complex support for `new module tests` and criterion tests (and add complex tests for L1)
3. Modify some existing tests to support complex

Pull Request resolved: #49912

Reviewed By: zhangguanheng66

Differential Revision: D25853036

Pulled By: soulitzer

fbshipit-source-id: df619f1b71c450ab2818eb17804e0c55990aa8ad
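
The backward-formula change mentioned in the summary can be sketched as follows (pure Python, illustrative only, not the autograd kernel): the subgradient of `|z|` generalizes `sign` to `sgn(z) = z / |z|` for complex `z`, which is what `torch.sgn` computes.

```python
def l1_loss_backward(inputs, targets, grad_output=1.0):
    # subgradient of sum |x - y| w.r.t. x: sgn(x - y), where for complex z
    # sgn(z) = z / |z| (and 0 at z == 0), as computed by torch.sgn
    grads = []
    for x, y in zip(inputs, targets):
        d = x - y
        grads.append(grad_output * (d / abs(d)) if d else 0.0)
    return grads

# for x - y = 3 + 4j, |d| = 5, so the gradient is 0.6 + 0.8j
print(l1_loss_backward([3 + 4j], [0j]))
```

Under a `mean` reduction each entry would additionally be scaled by `1/N` for `N` elements.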
@github-actions
Contributor

Looks like this PR hasn't been updated in a while so we're going to go ahead and mark this as Stale.
Feel free to remove the Stale label if you feel this was a mistake.
If you are unable to remove the Stale label please contact a maintainer in order to do so.
If you want the bot to never mark this PR stale again, add the no-stale label.
Stale pull requests will automatically be closed after 30 days of inactivity.

@github-actions github-actions Bot added the Stale label Apr 13, 2022
@anjali411 anjali411 closed this Apr 21, 2022
@facebook-github-bot facebook-github-bot deleted the gh/anjali411/68/head branch May 22, 2022 14:17
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026