Add Dropout1d module #79545
Conversation
❌ 18 new failures as of commit d90de21 (more details on the Dr. CI page).
🕵️ 17 new failures recognized by patterns; these CI failures do not appear to be due to upstream breakages.
This comment was automatically generated by Dr. CI. Please report bugs/suggestions to the (internal) Dr. CI Users group.
Fixes #6442 [ghstack-poisoned]
return _VF.alpha_dropout_(input, p, training) if inplace else _VF.alpha_dropout(input, p, training)

def dropout1d(input: Tensor, p: float = 0.5, training: bool = True, inplace: bool = False) -> Tensor:
Should there be a feature_dropout_nd function that factors out the common code between 1d/2d/3d? Doesn't need to happen now.
flagged for later - this should definitely happen
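To make the suggestion concrete, here is a minimal sketch of what a shared feature-dropout helper could look like. This is not the ATen/torch implementation; it is a hypothetical illustration written with NumPy, where `feature_dropout_nd` and its `feature_dims` parameter are invented names. The common logic between 1d/2d/3d is the per-(sample, channel) Bernoulli mask broadcast over the spatial dimensions, with surviving channels rescaled by 1/(1-p):

```python
import numpy as np

def feature_dropout_nd(x: np.ndarray, p: float = 0.5, training: bool = True,
                       feature_dims: int = 1) -> np.ndarray:
    """Zero out entire channels of a (N, C, *spatial) array with probability p,
    scaling surviving channels by 1/(1-p) so the expected value is unchanged.
    `feature_dims` is the number of spatial dims (1 for 1d, 2 for 2d, 3 for 3d)."""
    if not 0.0 <= p < 1.0:  # sketch only; real dropout also accepts p == 1
        raise ValueError(f"dropout probability must be in [0, 1), got {p}")
    if not training or p == 0.0:
        return x
    n, c = x.shape[0], x.shape[1]
    # One Bernoulli draw per (sample, channel); broadcast over spatial dims.
    mask = (np.random.rand(n, c) >= p).astype(x.dtype)
    mask = mask.reshape((n, c) + (1,) * feature_dims)
    return x * mask / (1.0 - p)
```

With this shape-parameterized helper, the 1d/2d/3d entry points would only differ in input validation and the value of `feature_dims`.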
class Dropout1d(_DropoutNd):
    r"""Randomly zero out entire channels (a channel is a 1D feature map,
same, doc strings can probably be shared between 1d/2d/3d
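The base-class structure under discussion can be illustrated with a toy stand-in (plain NumPy, not the real torch.nn classes): the shared `_DropoutNd` holds the common state, and each subclass only decides the mask's shape. Class and attribute names mirror the PR for readability, but the bodies are a sketch:

```python
import numpy as np

class _DropoutNd:
    """Toy stand-in for the shared dropout base class: validates and stores p
    and the training flag; subclasses only choose how the mask is shaped."""
    def __init__(self, p: float = 0.5):
        if not 0.0 <= p < 1.0:  # sketch only; real dropout also accepts p == 1
            raise ValueError(f"dropout probability must be in [0, 1), got {p}")
        self.p = p
        self.training = True

class Dropout1d(_DropoutNd):
    """Channel-wise dropout for (N, C, L) inputs: each channel is kept or
    zeroed as a whole, and kept channels are scaled by 1/(1-p)."""
    def __call__(self, x: np.ndarray) -> np.ndarray:
        if not self.training or self.p == 0.0:
            return x
        # Mask shape (N, C, 1) broadcasts over L: whole channels drop together.
        mask = (np.random.rand(x.shape[0], x.shape[1], 1) >= self.p)
        return x * mask.astype(x.dtype) / (1.0 - self.p)
```

Because the 2d/3d variants would differ only in the mask's trailing singleton dimensions, both the implementation and the docstring text are natural candidates for sharing, as the comment suggests.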
@pytorchbot merge

@pytorchbot successfully started a merge job. Check the current status here
Hey @jbschlosser. |
Pull Request resolved: #79545 Approved by: https://github.com/ngimel, https://github.com/albanD Co-authored-by: Joel Benjamin Schlosser <jbschlosser@fb.com>
Summary: Pull Request resolved: #79545. Approved by: https://github.com/ngimel, https://github.com/albanD
Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/2d73c8e6e0378655f732a48ec50ae1908ce0a4a4
Reviewed By: malfet
Differential Revision: D37208288
Pulled By: jbschlosser
fbshipit-source-id: df5e95c9a305e50abc0fbd73dbfc63fbeb173f8d
Stack from ghstack:
Fixes #6442