Doc update for complex numbers#51129
Closed
anjali411 wants to merge 2 commits into gh/anjali411/90/base from
Conversation
[ghstack-poisoned]
Contributor
💊 CI failures summary and remediations — as of commit 1bb21a0: 💚 Looks good so far! There are no failures yet. 💚 (comment automatically generated by Dr. CI)
mruberry
reviewed
Jan 26, 2021
mruberry
reviewed
Jan 26, 2021
Codecov Report
@@ Coverage Diff @@
## gh/anjali411/90/base #51129 +/- ##
========================================================
- Coverage 80.88% 80.54% -0.35%
========================================================
Files 1931 1931
Lines 210560 210560
========================================================
- Hits 170311 169593 -718
- Misses 40249 40967 +718
albanD
approved these changes
Jan 27, 2021
- Spectral operations (e.g., :func:`torch.fft`, :func:`torch.stft` etc.) currently don't use complex tensors but
- the API will be soon updated to use complex tensors.
+ Spectral operations in the `torch.fft module <https://pytorch.org/docs/stable/fft.html#torch-fft>`_ support
+ native complex tensors.
Collaborator
nit: "complex tensors natively"?
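The doc line under review says `torch.fft` now supports complex tensors natively. As a stdlib-only illustration of what that means (a naive DFT over Python `complex` values, not the PyTorch API — `torch.fft.fft` computes the same transform on tensors), even a real input produces a complex spectrum:

```python
import cmath

def dft(x):
    """Naive O(n^2) discrete Fourier transform over Python complex numbers.

    Illustrative sketch only; torch.fft computes this on complex tensors.
    """
    n = len(x)
    return [
        sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
        for k in range(n)
    ]

# A real cosine at frequency 1 over 8 samples concentrates its (complex)
# spectral energy in bins 1 and n-1 = 7.
signal = [cmath.cos(2 * cmath.pi * t / 8).real for t in range(8)]
spectrum = dft(signal)
```

Before native complex support, such spectra had to be carried around as separate real and imaginary channels; the doc change records that this workaround is no longer needed.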
- functions soon: :func:`torch.matmul`, :func:`torch.solve`, :func:`torch.eig`,
- :func:`torch.symeig`. If any of these would help your use case, please
- `search <https://github.com/pytorch/pytorch/issues?q=is%3Aissue+is%3Aopen+complex>`_
+ Many linear algebra operations, like :func:`torch.matmul`, :func:`torch.svd`, :func:`torch.solve` etc., support complex numbers.
Collaborator
Should the , be before the etc.?
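For intuition on the linear algebra support the doc line describes: complex matrix multiplication is the real algorithm with complex multiply-accumulate. A minimal stdlib sketch (the helper name `complex_matmul` is hypothetical — `torch.matmul` handles this directly on complex tensors):

```python
def complex_matmul(a, b):
    """Multiply two matrices given as lists of lists of Python complex numbers.

    Illustrative stand-in for torch.matmul on complex tensors; only the
    scalar multiply-accumulate differs from the real-valued case.
    """
    rows, inner, cols = len(a), len(b), len(b[0])
    return [
        [sum(a[i][k] * b[k][j] for k in range(inner)) for j in range(cols)]
        for i in range(rows)
    ]

# (1+1j) is a scaled 45-degree rotation, so squaring it gives 2j.
a = [[1 + 1j, 0], [0, 1 - 1j]]
result = complex_matmul(a, a)
```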
- you get the regular complex gradient. For :math:`C → R` real-valued loss functions,
- `grad.conj()` gives a descent direction. For more details, check out the note :ref:`complex_autograd-doc`.
+ PyTorch supports autograd for complex tensors. The gradient computed is the Conjugate Wirtinger derivative,
+ the negative of which is precisely the direction of steepest descent used in Gradient Descent algorithm. Thus,
Collaborator
nit: that should be used in Gradient Descent
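The Conjugate Wirtinger derivative mentioned in the doc change can be checked numerically: for f: C → R it equals (∂f/∂x + i·∂f/∂y)/2, and stepping against it decreases the loss. A stdlib sketch, assuming the conventional factor-of-two normalization is absorbed into the step size (the function name is hypothetical; PyTorch's complex autograd computes this analytically):

```python
def conj_wirtinger_grad(f, z, h=1e-6):
    """Central-difference estimate of the conjugate Wirtinger derivative df/dz̄.

    For a real-valued f of a complex argument, df/dz̄ = (df/dx + i*df/dy)/2,
    and its negative points in the direction of steepest descent.
    """
    dx = (f(z + h) - f(z - h)) / (2 * h)          # partial w.r.t. real part
    dy = (f(z + 1j * h) - f(z - 1j * h)) / (2 * h)  # partial w.r.t. imag part
    return (dx + 1j * dy) / 2

f = lambda z: abs(z) ** 2      # a simple C -> R "loss"
z = 3 + 4j
g = conj_wirtinger_grad(f, z)  # for f(z) = |z|^2 this is z itself
z_next = z - 0.1 * g           # gradient-descent step
```

Here `f(z_next) < f(z)`, confirming that the negative of this derivative is a descent direction, which is exactly the property the rewritten doc sentence states.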
Contributor
@anjali411 merged this pull request in fd9a85d.
laurentdupin
pushed a commit
to laurentdupin/pytorch
that referenced
this pull request
Apr 24, 2026
Summary: Pull Request resolved: pytorch#51129 Test Plan: Imported from OSS Reviewed By: pbelevich Differential Revision: D26094947 Pulled By: anjali411 fbshipit-source-id: 4e1cdf8915a8c6a86ac3462685cdce881e1bcffa
Stack from ghstack:
Differential Revision: D26094947