Doc update for complex numbers#51129

Closed
anjali411 wants to merge 2 commits into gh/anjali411/90/base from gh/anjali411/90/head

Conversation

@anjali411
Contributor

@anjali411 anjali411 commented Jan 26, 2021

Stack from ghstack:

Differential Revision: D26094947

anjali411 added a commit that referenced this pull request Jan 26, 2021
ghstack-source-id: 39ec88e
Pull Request resolved: #51129
@facebook-github-bot
Contributor

facebook-github-bot commented Jan 26, 2021

💊 CI failures summary and remediations

As of commit 1bb21a0 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚



@anjali411 anjali411 requested review from albanD and mruberry January 26, 2021 19:17
@anjali411 anjali411 added the complex_autograd and module: complex (Related to complex number support in PyTorch) labels Jan 26, 2021
@anjali411 anjali411 requested a review from ezyang January 26, 2021 19:18
Comment thread docs/source/complex_numbers.rst Outdated
Collaborator

@mruberry mruberry left a comment


Cool!

anjali411 added a commit that referenced this pull request Jan 27, 2021
ghstack-source-id: b48afd0
Pull Request resolved: #51129
@codecov

codecov Bot commented Jan 27, 2021

Codecov Report

Merging #51129 (1bb21a0) into gh/anjali411/90/base (ba316a7) will decrease coverage by 0.34%.
The diff coverage is n/a.

@@                   Coverage Diff                    @@
##           gh/anjali411/90/base   #51129      +/-   ##
========================================================
- Coverage                 80.88%   80.54%   -0.35%     
========================================================
  Files                      1931     1931              
  Lines                    210560   210560              
========================================================
- Hits                     170311   169593     -718     
- Misses                    40249    40967     +718     

Collaborator

@albanD albanD left a comment


lgtm just some nits

Spectral operations (e.g., :func:`torch.fft`, :func:`torch.stft` etc.) currently don't use complex tensors but
the API will be soon updated to use complex tensors.
Spectral operations in the `torch.fft module <https://pytorch.org/docs/stable/fft.html#torch-fft>`_ support
native complex tensors.
Collaborator


nit: "complex tensors natively"?
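
The behaviour the updated passage describes can be checked with a short sketch (assumes a PyTorch build that includes the torch.fft module; the values are illustrative and not part of the patch):

```python
import torch

# torch.fft operates natively on complex tensors: a complex input
# stays complex through the transform.
x = torch.tensor([1 + 1j, 2 - 2j, 3 + 3j, 4 - 4j], dtype=torch.complex64)
X = torch.fft.fft(x)
assert X.is_complex()

# Round-tripping through the inverse transform recovers the signal.
x_back = torch.fft.ifft(X)
assert torch.allclose(x, x_back, atol=1e-5)
```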

functions soon: :func:`torch.matmul`, :func:`torch.solve`, :func:`torch.eig`,
:func:`torch.symeig`. If any of these would help your use case, please
`search <https://github.com/pytorch/pytorch/issues?q=is%3Aissue+is%3Aopen+complex>`_
Many linear algebra operations, like :func:`torch.matmul`, :func:`torch.svd`, :func:`torch.solve` etc., support complex numbers.
Collaborator


Should the , be before the etc.?
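
As a quick illustration of the complex linear algebra support mentioned in the new text, a minimal sketch (assumes a recent PyTorch where torch.linalg.svd is available; shapes and dtypes here are illustrative):

```python
import torch

# Complex dtypes are accepted directly by linear algebra routines
# such as torch.matmul and the SVD.
a = torch.randn(2, 3, dtype=torch.complex64)
b = torch.randn(3, 2, dtype=torch.complex64)
c = torch.matmul(a, b)
assert c.shape == (2, 2) and c.is_complex()

# SVD of a complex matrix: a == U @ diag(S) @ V^H up to numerical error.
u, s, vh = torch.linalg.svd(a, full_matrices=False)
reconstructed = (u * s) @ vh  # scale columns of U by the singular values
assert torch.allclose(a, reconstructed, atol=1e-4)
```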

you get the regular complex gradient. For :math:`C → R` real-valued loss functions,
`grad.conj()` gives a descent direction. For more details, check out the note :ref:`complex_autograd-doc`.
PyTorch supports autograd for complex tensors. The gradient computed is the Conjugate Wirtinger derivative,
the negative of which is precisely the direction of steepest descent used in Gradient Descent algorithm. Thus,
Collaborator


nit: that should be used in Gradient Descent
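
A small sketch of the behaviour the new paragraph describes: for a C → R loss, stepping against the computed gradient decreases the loss, so plain gradient descent works with complex parameters (the loss and learning rate here are illustrative):

```python
import torch

# f(z) = |z|^2 is a C -> R function; per the note, backward() computes
# the conjugate Wirtinger derivative, so -z.grad is a descent direction.
z = torch.tensor(1.0 + 2.0j, requires_grad=True)
loss = (z * z.conj()).real  # |z|^2, a real-valued scalar
loss.backward()
assert z.grad is not None and z.grad.is_complex()

# A plain gradient step shrinks the loss.
with torch.no_grad():
    z_next = z - 0.1 * z.grad
assert (z_next * z_next.conj()).real < loss
```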

@facebook-github-bot
Contributor

@anjali411 merged this pull request in fd9a85d.

@facebook-github-bot facebook-github-bot deleted the gh/anjali411/90/head branch January 31, 2021 15:18
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
Summary: Pull Request resolved: pytorch#51129

Test Plan: Imported from OSS

Reviewed By: pbelevich

Differential Revision: D26094947

Pulled By: anjali411

fbshipit-source-id: 4e1cdf8915a8c6a86ac3462685cdce881e1bcffa

Labels

cla signed, complex_autograd, Merged, module: complex (Related to complex number support in PyTorch)

4 participants