Fix broadcasting issues in binary_cross_entropy_with_logits #1944

soumith merged 9 commits into pytorch:master
Conversation
apaszke
left a comment
I don't think this is a good solution. It's better to have an input/target shape spec.
@apaszke you mean assert that the input and target sizes match exactly?
Yeah, I guess that if it's an element-wise loss then the sizes should match exactly.
This was my first thought too, as the docstring also says they should be the same size. cc @martinarjovsky, who had some thoughts on this.
(btw I think there's still an issue with the weight sizing in my test - I'll fix that once we reach a conclusion on how we should fix this issue generally)
@apaszke if we add a check there, it will break backwards compatibility, as I think many people use it in the following way:

```python
target = Variable(torch.rand(64))
output = Variable(torch.rand(64, 1) - 0.5)
nn.BCELoss()(sigmoid(output), target)
```
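The silent failure mode in that snippet comes from NumPy-style broadcasting: a `(64, 1)` output against a `(64,)` target expands to `(64, 64)`, so the loss is averaged over 64×64 element pairs instead of 64. A minimal sketch of the broadcasting rule (a hypothetical helper for illustration, not the PyTorch implementation):

```python
from itertools import zip_longest

def broadcast_shape(a, b):
    # Hypothetical helper (not part of PyTorch): computes the NumPy-style
    # broadcast of two shapes, aligning from the trailing dimension and
    # padding the shorter shape with 1s.
    out = []
    for x, y in zip_longest(reversed(a), reversed(b), fillvalue=1):
        if x != 1 and y != 1 and x != y:
            raise ValueError("shapes %r and %r are not broadcastable" % (a, b))
        out.append(max(x, y))
    return tuple(reversed(out))

# A (64, 1) output against a (64,) target silently broadcasts to (64, 64):
print(broadcast_shape((64, 1), (64,)))  # -> (64, 64)
```

This is why the mismatch went unnoticed: the shapes are broadcast-compatible, so no error was raised even though the pairing of outputs to targets was wrong.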
Looking at the interface, I'm in favour of asserting that the sizes match exactly.
How about adding strict checks to the new function, and a deprecation warning to the existing one?
Sounds sensible - I'll update this PR with that then.
…est_bce_with_logits_gives_same_result_as_sigmooid_and_bce_loss
@apaszke I've made the changes and added the warning.
Hello! Adding checks for sizes would be good. The thing I think is really necessary is that the behaviour of the 'with_logits' version is exactly the same as the one on probabilities (for BCE and non-binary CE). The main thing that was worrisome to me was that a shape mismatch affected one and not the other. Cheers :)
thanks Alykhan!
```python
    for each minibatch.
    """
    if not target.is_same_size(input):
        warnings.warn("Using a target size ({}) that is different to the input size ({}) is deprecated. "
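The pattern in the diff - warn rather than raise, so existing callers keep working - can be sketched without torch as follows (shapes stand in for tensors; the helper name is illustrative, not the actual PyTorch code):

```python
import warnings

def check_target_size(input_shape, target_shape):
    # Simplified stand-in for the target.is_same_size(input) check above:
    # a mismatch is deprecated but still allowed, preserving backward
    # compatibility while notifying the user.
    if input_shape != target_shape:
        warnings.warn(
            "Using a target size ({}) that is different to the input size "
            "({}) is deprecated. Please ensure they have the same size."
            .format(target_shape, input_shape))

check_target_size((64, 1), (64,))  # emits a UserWarning
```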
…09c7db (pytorch#19454) Summary: Pull Request resolved: pytorch#19454. Previous import was ad7313470a9119d7e1afda7edf1d654497ee80ab. Included changes:
- **[83dd6265](onnx/onnx@83dd6265)**: Add NonMaxSuppression operator (pytorch#1703) <Hector Li>
- **[31ca5d6f](onnx/onnx@31ca5d6f)**: add node tests for quantized ops (pytorch#1944) <Ashwini Khade>
- **[e6076c1d](onnx/onnx@e6076c1d)**: Fix test stat coverage script (pytorch#1948) <Raymond Yang>
- **[ad036405](onnx/onnx@ad036405)**: Add IsInf to detect infinity values (pytorch#1884) <Wei-Sheng Chin>

Differential Revision: D15010015
fbshipit-source-id: 9778757752785fe3169ad2ac606b37299aa69da6
In response to issue #1939, where this failed due to broadcasting because `o` and `t` have different shapes.
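For the with_logits path, the fix makes the mismatch a hard error rather than a warning. A minimal sketch of that strict check (illustrative names, not the actual implementation):

```python
def assert_same_size(input_shape, target_shape):
    # Strict variant: refuse to broadcast at all, so mismatched shapes can
    # never silently expand to a larger shape as in issue #1939.
    if input_shape != target_shape:
        raise ValueError("Target size ({}) must be the same as input size ({})"
                         .format(target_shape, input_shape))
    return True
```

With this in place, a `(64, 1)` output paired with a `(64,)` target fails loudly instead of quietly computing the loss over a broadcast `(64, 64)` grid.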