Conversation
- Add convenience wrapper to pass tensors as input_lengths, target_lengths
- Fix documentation example
- Check BLANK >= 0

Thank you, Simon and Soumith for the suggestions!
soumith
left a comment
Can you add the allowed types for each input?
For example:
input_lengths: Tensor or tuple of size :math:`(N)`.
- func: ctc_loss(Tensor log_probs, Tensor targets, IntList input_lengths, IntList target_lengths, int64_t blank=0, int64_t reduction=Reduction::ElementwiseMean) -> Tensor
  variants: function
- func: ctc_loss(Tensor log_probs, Tensor targets, Tensor input_lengths, Tensor target_lengths, int64_t blank=0, int64_t reduction=Reduction::ElementwiseMean) -> Tensor
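The two overloads above mean callers can pass either plain Python sequences (the `IntList` form) or `LongTensor`s (the convenience wrapper this PR adds) for the length arguments. A minimal sketch of both call styles, assuming a PyTorch build that includes this change:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# (T, N, C) = (50, 2, 20): 50 time steps, batch of 2, 20 classes (blank = 0).
log_probs = torch.randn(50, 2, 20).log_softmax(2)
targets = torch.randint(1, 20, (2, 30), dtype=torch.long)  # targets must not be blank

# Overload 1: lengths as plain Python tuples (the IntList form).
loss_list = F.ctc_loss(log_probs, targets, (50, 50), (30, 30))

# Overload 2: lengths as LongTensors (the wrapper added in this PR).
input_lengths = torch.full((2,), 50, dtype=torch.long)
target_lengths = torch.full((2,), 30, dtype=torch.long)
loss_tensor = F.ctc_loss(log_probs, targets, input_lengths, target_lengths)
```

Both calls compute the same loss on the same inputs; the tensor form is convenient when lengths already live in a tensor (e.g. produced by a data loader).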
targets: Tensor of size :math:`(N, S)` or :math:`(sum(target_lengths))`.
    Targets (cannot be blank). In the second form, the targets are assumed to be concatenated.
input_lengths: Tuple or tensor of size :math:`(N)`.
torch/nn/modules/loss.py
Outdated
>>> targets = torch.randint(1, 21, (16, 30), dtype=torch.long)
>>> input_lengths = torch.full((16,), 50, dtype=torch.long)
>>> target_lengths = torch.randint(10, 30, (16,), dtype=torch.long)
>>> input_lengths = torch.full((16,), 50, dtype=torch.long).toarray()
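The flagged line calls `.toarray()`, which is not a PyTorch Tensor method; with the new wrapper the tensor can be passed directly. A runnable sketch of the corrected docstring example, using the `nn.CTCLoss` module form (shapes follow the example above):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

ctc_loss = nn.CTCLoss()
# (T, N, C) = (50, 16, 20): 50 time steps, batch of 16, 20 classes (blank = 0).
log_probs = torch.randn(50, 16, 20).log_softmax(2).detach().requires_grad_()
targets = torch.randint(1, 20, (16, 30), dtype=torch.long)
input_lengths = torch.full((16,), 50, dtype=torch.long)   # no .toarray() needed
target_lengths = torch.randint(10, 30, (16,), dtype=torch.long)

loss = ctc_loss(log_probs, targets, input_lengths, target_lengths)
loss.backward()
```

This is the same computation the functional `ctc_loss` performs; the module form just stores `blank` and `reduction` as constructor arguments.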
facebook-github-bot
left a comment
SsnL is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Summary: - Add convenience wrapper to pass tensors as input_lengths, target_lengths - Fix documentation example - Check BLANK >= 0 Thank you, Simon and Soumith for the suggestions! Pull Request resolved: pytorch/pytorch#10112 Differential Revision: D9130737 Pulled By: SsnL fbshipit-source-id: f9a0022a969788bda3db9f360e2564b519ebf2e6