Removed propagate labels check in loss layers #1483
Conversation
I don't understand this: how are these layers able to backprop the error to integer labels now? It doesn't look like this adds anything that accesses ...
These layers are not able to backprop to labels; they didn't do it before and they don't do it now.
So, to be clear, what I meant to say in #1448 was, "figure out why this check exists, why you are running into it, and what should be done about it". So let's think about that... Backpropagation to labels doesn't make sense, of course, because labels lack topological, let alone differentiable, structure. So why does the check exist in the first place?
And what about these errors catching mistakes where differentiable things are hooked up to the loss? I'll let y'all mull that over for now.
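For reference, here is a minimal sketch of the kind of guard under discussion, following the conventions of a Caffe loss layer's Backward_cpu (the unqualified vector, Blob, and LOG come from the framework headers); the exact check removed by this PR may differ in detail.

```cpp
// Sketch of the label-backprop guard in a loss layer's backward pass
// (illustrative only; the exact check touched by this PR may differ).
template <typename Dtype>
void SoftmaxWithLossLayer<Dtype>::Backward_cpu(
    const vector<Blob<Dtype>*>& top, const vector<bool>& propagate_down,
    const vector<Blob<Dtype>*>& bottom) {
  // bottom[1] holds the integer labels; no gradient is defined for them.
  if (propagate_down[1]) {
    LOG(FATAL) << "SoftmaxWithLoss Layer cannot backpropagate to label inputs.";
  }
  if (propagate_down[0]) {
    // ... compute the gradient w.r.t. the predictions in bottom[0] ...
  }
}
```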
As a small motivational aside, I had to disable this check in a recent ...
I think it is because the propagate_down flag is computed in an "or" fashion, so it is not differentiated at the level of individual blobs. The backprop in loss layers was engineered to work directly with labels coming from input layers (data layers, etc.), so the logic was not designed to support intermediate layers that handle labels and need backpropagation.
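A toy sketch of that "or" behaviour (the names here are illustrative, not the exact Caffe internals): a single per-layer backward flag is the OR over the per-bottom-blob needs, so the blob-level distinction between predictions and labels is lost by the time the loss layer is asked to backpropagate.

```cpp
#include <vector>

// Illustrative only: one flag per layer, computed as the OR of the
// per-bottom-blob needs. As soon as any bottom of a loss layer needs
// backprop (e.g. predictions produced by learnable layers), the whole
// layer is asked to backpropagate, including to its label bottom.
bool LayerNeedsBackward(const std::vector<bool>& bottom_need_backward) {
  bool need_backward = false;
  for (bool b : bottom_need_backward) {
    need_backward = need_backward || b;
  }
  return need_backward;
}
```

Keeping a separate propagate_down entry per bottom blob, as suggested in the next comment, would preserve that distinction instead of collapsing it into one flag.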
Maybe we should allow each layer to have a vector of propagate_down defined.
Sergio
@shelhamer This needs a unified decision from the BVLC members. It has been waiting for 26 days.
Closed; the new PR is #2052.
Removed the checks in the loss layers, as discussed in #1448 with @longjon (point 7).
Now the loss layers no longer check whether backpropagation on the labels is requested; this can now happen because of #1482.