
Update padding_idx docs for EmbeddingBag to better match Embedding's #56065

Closed
jbschlosser wants to merge 2 commits into pytorch:master from jbschlosser:eb_padding_idx_docs

Conversation

@jbschlosser (Contributor) commented Apr 14, 2021

Match updated Embedding docs from #54026 as closely as possible. Additionally, update the C++ side Embedding docs, since those were missed in the previous PR.

There are 6 (!) places for docs:

  1. Python module form in sparse.py - includes an additional line about newly constructed Embeddings / EmbeddingBags
  2. Python from_pretrained() in sparse.py (refers back to module docs)
  3. Python functional form in functional.py
  4. C++ module options - includes an additional line about newly constructed Embeddings / EmbeddingBags
  5. C++ from_pretrained() options
  6. C++ functional options
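For context, a hypothetical sketch (not part of the PR diff) of the `padding_idx` behavior these six doc sites describe, on the Python side. It assumes a PyTorch version where `EmbeddingBag` supports `padding_idx` (1.9 or later):

```python
# Sketch of EmbeddingBag's padding_idx semantics: padding entries are
# excluded from the reduction and receive no gradient, so the pad
# vector stays fixed during training.
import torch
import torch.nn as nn

bag = nn.EmbeddingBag(num_embeddings=5, embedding_dim=3,
                      mode="sum", padding_idx=0)

# The second bag consists entirely of padding indices.
inp = torch.tensor([[0, 2], [0, 0]])
out = bag(inp)
print(out[1])  # all zeros: padding entries contribute nothing to the sum

out.sum().backward()
print(bag.weight.grad[0])  # all zeros: the pad vector gets no gradient
```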

@facebook-github-bot (Contributor) commented Apr 14, 2021

💊 CI failures summary and remediations

As of commit 898765a (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

Please report bugs/suggestions to the (internal) Dr. CI Users group.

@jbschlosser jbschlosser removed the request for review from glaringlee April 15, 2021 18:06
@kurtamohler (Collaborator) left a comment

Nice improvement

@jbschlosser jbschlosser force-pushed the eb_padding_idx_docs branch from 13344a9 to 9a003e5 Compare April 20, 2021 20:25
@facebook-github-bot (Contributor) commented

@jbschlosser has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

/// If given, pads the output with the embedding vector at `padding_idx` (initialized to zeros) whenever it encounters the index.
/// If specified, the entries at `padding_idx` do not contribute to the
/// gradient; therefore, the embedding vector at `padding_idx` is not updated
/// during training, i.e. it remains as a fixed “pad”. For a newly constructed
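A hypothetical Python-side sketch (not part of this PR's diff) of the behavior this doc comment describes, using `nn.Embedding`:

```python
# The vector at padding_idx receives no gradient, so it remains a
# fixed "pad" during training; for a newly constructed module it is
# also initialized to zeros.
import torch
import torch.nn as nn

emb = nn.Embedding(num_embeddings=4, embedding_dim=3, padding_idx=0)
out = emb(torch.tensor([0, 1, 0, 2]))
out.sum().backward()

print(emb.weight.grad[0])  # all zeros: the pad row never updates
print(emb.weight[0])       # also zeros in a newly constructed module
```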

Hmm, can you please explain what the semantic meaning is behind the Unicode double quotes? Shouldn't this just be in single quotes, to indicate a code block?

Suggested change:
- /// during training, i.e. it remains as a fixed “pad”. For a newly constructed
+ /// during training, i.e. it remains as a fixed ``pad``. For a newly constructed

@facebook-github-bot (Contributor) commented

@jbschlosser merged this pull request in 8a81c4d.

malfet added a commit to malfet/pytorch that referenced this pull request Apr 21, 2021
facebook-github-bot pushed a commit that referenced this pull request Apr 22, 2021
Summary: Pull Request resolved: #56618

Reviewed By: albanD

Differential Revision: D27919343

Pulled By: malfet

fbshipit-source-id: 2fac8ba5f399e050463141eba225da935c97a5ce
krshrimali pushed a commit to krshrimali/pytorch that referenced this pull request May 19, 2021
…ytorch#56065)

Summary:
Match updated `Embedding` docs from pytorch#54026 as closely as possible. Additionally, update the C++ side `Embedding` docs, since those were missed in the previous PR.

There are 6 (!) places for docs:
1. Python module form in `sparse.py` - includes an additional line about newly constructed `Embedding`s / `EmbeddingBag`s
2. Python `from_pretrained()` in `sparse.py` (refers back to module docs)
3. Python functional form in `functional.py`
4. C++ module options - includes an additional line about newly constructed `Embedding`s / `EmbeddingBag`s
5. C++ `from_pretrained()` options
6. C++ functional options

Pull Request resolved: pytorch#56065

Reviewed By: malfet

Differential Revision: D27908383

Pulled By: jbschlosser

fbshipit-source-id: c5891fed1c9d33b4b8cd63500a14c1a77d92cc78
krshrimali pushed a commit to krshrimali/pytorch that referenced this pull request May 19, 2021
Summary: Pull Request resolved: pytorch#56618

Reviewed By: albanD

Differential Revision: D27919343

Pulled By: malfet

fbshipit-source-id: 2fac8ba5f399e050463141eba225da935c97a5ce
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
…ytorch#56065)

Summary:
Match updated `Embedding` docs from pytorch#54026 as closely as possible. Additionally, update the C++ side `Embedding` docs, since those were missed in the previous PR.

There are 6 (!) places for docs:
1. Python module form in `sparse.py` - includes an additional line about newly constructed `Embedding`s / `EmbeddingBag`s
2. Python `from_pretrained()` in `sparse.py` (refers back to module docs)
3. Python functional form in `functional.py`
4. C++ module options - includes an additional line about newly constructed `Embedding`s / `EmbeddingBag`s
5. C++ `from_pretrained()` options
6. C++ functional options

Pull Request resolved: pytorch#56065

Reviewed By: malfet

Differential Revision: D27908383

Pulled By: jbschlosser

fbshipit-source-id: c5891fed1c9d33b4b8cd63500a14c1a77d92cc78
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
Summary: Pull Request resolved: pytorch#56618

Reviewed By: albanD

Differential Revision: D27919343

Pulled By: malfet

fbshipit-source-id: 2fac8ba5f399e050463141eba225da935c97a5ce

4 participants