Update padding_idx docs for EmbeddingBag to better match Embedding's#56065
Closed

jbschlosser wants to merge 2 commits into pytorch:master from
Conversation
Contributor
💊 CI failures summary and remediations: As of commit 898765a (more details on the Dr. CI page): 💚 Looks good so far! There are no failures yet. 💚 (This comment was automatically generated by Dr. CI.)
Force-pushed from 13344a9 to 9a003e5.
Contributor

@jbschlosser has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
malfet reviewed on Apr 21, 2021
/// If given, pads the output with the embedding vector at `padding_idx` (initialized to zeros) whenever it encounters the index.
/// If specified, the entries at `padding_idx` do not contribute to the
/// gradient; therefore, the embedding vector at `padding_idx` is not updated
/// during training, i.e. it remains as a fixed “pad”. For a newly constructed
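The semantics being documented above can be illustrated with a short Python sketch (an illustration, not part of this PR; it uses `nn.Embedding`, which has supported `padding_idx` for some time, and assumes `torch` is installed):

```python
import torch
import torch.nn as nn

# Sketch of the padding_idx semantics described in the docs above.
# For a newly constructed module, the vector at padding_idx defaults to zeros.
emb = nn.Embedding(num_embeddings=5, embedding_dim=3, padding_idx=0)

indices = torch.tensor([0, 2, 0, 4])
out = emb(indices)
out.sum().backward()

# Entries at padding_idx receive no gradient, so that row stays a fixed "pad".
print(emb.weight[0])       # all zeros at construction
print(emb.weight.grad[0])  # all zeros after backward
```

Because the gradient row at `padding_idx` is always zero, no optimizer step ever moves that row away from its initial value.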
Contributor
Hmm, can you please explain the semantic meaning behind the unicode double quotes? Shouldn't this just be in single quotes, to indicate a code block?
Suggested change:

- /// during training, i.e. it remains as a fixed “pad”. For a newly constructed
+ /// during training, i.e. it remains as a fixed ``pad``. For a newly constructed
Contributor

@jbschlosser merged this pull request in 8a81c4d.
malfet added a commit to malfet/pytorch that referenced this pull request on Apr 21, 2021
facebook-github-bot pushed a commit that referenced this pull request on Apr 22, 2021
krshrimali pushed a commit to krshrimali/pytorch that referenced this pull request on May 19, 2021
…ytorch#56065)

Summary: Match updated `Embedding` docs from pytorch#54026 as closely as possible. Additionally, update the C++ side `Embedding` docs, since those were missed in the previous PR.

There are 6 (!) places for docs:
1. Python module form in `sparse.py` - includes an additional line about newly constructed `Embedding`s / `EmbeddingBag`s
2. Python `from_pretrained()` in `sparse.py` (refers back to module docs)
3. Python functional form in `functional.py`
4. C++ module options - includes an additional line about newly constructed `Embedding`s / `EmbeddingBag`s
5. C++ `from_pretrained()` options
6. C++ functional options

Pull Request resolved: pytorch#56065
Reviewed By: malfet
Differential Revision: D27908383
Pulled By: jbschlosser
fbshipit-source-id: c5891fed1c9d33b4b8cd63500a14c1a77d92cc78
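The `EmbeddingBag` side of the behavior these docs describe can be sketched as follows (an illustration only; it assumes a PyTorch build that includes `padding_idx` support on `EmbeddingBag`, which landed around the time of this PR):

```python
import torch
import torch.nn as nn

# Sketch of EmbeddingBag's padding_idx behavior: entries equal to
# padding_idx are excluded from each bag's reduction, and the row at
# padding_idx is initialized to zeros for a newly constructed module.
bag = nn.EmbeddingBag(num_embeddings=5, embedding_dim=3,
                      mode='sum', padding_idx=0)

# One bag containing [0, 2, 0]: the two padding entries contribute
# nothing, so the bag's sum is just the embedding vector for index 2.
inp = torch.tensor([[0, 2, 0]])
out = bag(inp)
print(torch.allclose(out[0], bag.weight[2]))
```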
krshrimali pushed a commit to krshrimali/pytorch that referenced this pull request on May 19, 2021
Summary:
Pull Request resolved: pytorch#56618
Reviewed By: albanD
Differential Revision: D27919343
Pulled By: malfet
fbshipit-source-id: 2fac8ba5f399e050463141eba225da935c97a5ce
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request on Apr 24, 2026
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request on Apr 24, 2026
Match updated `Embedding` docs from #54026 as closely as possible. Additionally, update the C++ side `Embedding` docs, since those were missed in the previous PR.

There are 6 (!) places for docs:
1. Python module form in `sparse.py` - includes an additional line about newly constructed `Embedding`s / `EmbeddingBag`s
2. Python `from_pretrained()` in `sparse.py` (refers back to module docs)
3. Python functional form in `functional.py`
4. C++ module options - includes an additional line about newly constructed `Embedding`s / `EmbeddingBag`s
5. C++ `from_pretrained()` options
6. C++ functional options
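The functional form mentioned in the list can be exercised like this (a sketch under the assumption that this build of `torch.nn.functional.embedding_bag` accepts `padding_idx`; the example weights and indices are made up for illustration):

```python
import torch
import torch.nn.functional as F

# Sketch of padding_idx in the functional form, F.embedding_bag.
# Here the weight is user-supplied, so row 0 is NOT forced to zeros;
# instead, entries equal to padding_idx are simply excluded from each
# bag's reduction.
weight = torch.randn(5, 3)
inp = torch.tensor([0, 2, 0, 4])
offsets = torch.tensor([0, 2])  # two bags: [0, 2] and [0, 4]

out = F.embedding_bag(inp, weight, offsets, mode='sum', padding_idx=0)
# Each bag's sum skips the padded entries, leaving only weight[2]
# for the first bag and weight[4] for the second.
print(torch.allclose(out[0], weight[2]), torch.allclose(out[1], weight[4]))
```

Note the contrast with the module form: the module zeroes and freezes the row at `padding_idx`, while the functional form leaves the supplied weight untouched and only excludes those entries from the reduction.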