[WIP] Implement twice backward of ConvNd #1555
caogang wants to merge 24 commits into pytorch:master from
Conversation
* master:
  * Add F.normalize (pytorch#1467)
  * Expose custom attributes from C++ functions (pytorch#1430)
  * Add high order gradient support for Sigmoid (pytorch#1496)
* conv:
  * debug the segment fault of ConvBackwardBackward
  * Fix the compile Error
  * Change the methods called in ConvBackwardBackward
  * Add twice differentiate for cudnn_conv
  * Fix linear bug
  * Using mask_fill instead and modify the allocation in inplace mode
  * Modify 'norm_type' to 'p' and add TODO notes
  * Using expand_as instead
  * Add high order support for norm operator
  * Add high order support for sqrt operator
  * Modify the coding-style to satisfy PEP-8
  * Simplify the implementation of relu and threshold operator
  * Add high order support for relu and threshold operator
  * using grad_output.size() instead and no need to do zero_()
  * set grad_input volatile=True
  * Add case : grad_output.volatile
  * Modify the return value of sigmoid
  * Modify the sigmoid in torch.nn.functional.py
  * Fixed : a small bug
  * Add twice differentiation of sigmoid op
  * Conflicts: torch/autograd/_functions/reduce.py, torch/autograd/variable.py, torch/csrc/autograd/functions/init.cpp, torch/nn/_functions/linear.py, torch/nn/_functions/thnn/activation.py
apaszke
left a comment
As you noticed, ConvBackwardBackward is pretty much the same as ConvForward modulo some minor parameter changes. For this reason it's better to use the forward function to implement it, because if you do it this way, it will be differentiable as many times as you want. This implementation would only work for grad of grad.
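The point about implementing the double backward in terms of the forward function can be sketched with the modern `torch.autograd.grad` API (which postdates the Function-based code in this PR, so treat this as illustrative, not the PR's actual implementation): if the backward of a conv is itself composed of differentiable ops, grad-of-grad falls out for free.

```python
import torch
import torch.nn.functional as F

# Double precision keeps the example numerically well-behaved.
x = torch.randn(1, 2, 5, 5, dtype=torch.double, requires_grad=True)
w = torch.randn(3, 2, 3, 3, dtype=torch.double, requires_grad=True)

y = F.conv2d(x, w)
go = torch.randn_like(y)  # an arbitrary incoming grad_output

# First backward: create_graph=True records the backward computation itself,
# which is only possible because conv's backward is built from differentiable ops.
gx, gw = torch.autograd.grad(y, (x, w), go, create_graph=True)

# Second backward (grad of grad): differentiate the input gradient w.r.t. the weight.
ggw, = torch.autograd.grad(gx.sum(), (w,))
```

Because the first `grad` call builds a graph, the same mechanism keeps working for third and higher derivatives, which is exactly the advantage over a hand-written ConvBackwardBackward that only covers grad of grad.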
Hi, @apaszke. OK, I will change it to ConvForward. But before making that change, I want to confirm that the logic of this implementation is correct. Could you check it?
@albanD Can
calling
@albanD Oh, that's great. I will check it. Thanks!
You'll need to add a constructor for ConvNdBackward (right now it's
Hi, all. @albanD @apaszke
Please refer to PR #1569 for further discussion.
Hi, all
I have tried to implement the backward of the backward of ConvNd, and I have run an example. The example runs without error, but I am not sure whether the results are correct.
Please review it, and let me know if there are any problems.
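One way to go beyond "runs without error" is PyTorch's numerical checkers. A small sketch using `gradcheck`/`gradgradcheck` (these helpers postdate parts of this thread, so this is illustrative rather than what the PR itself used):

```python
import torch
import torch.nn.functional as F
from torch.autograd import gradcheck, gradgradcheck

# Double precision inputs: the finite-difference comparison needs the accuracy.
x = torch.randn(1, 2, 4, 4, dtype=torch.double, requires_grad=True)
w = torch.randn(2, 2, 3, 3, dtype=torch.double, requires_grad=True)

def conv(x, w):
    return F.conv2d(x, w, padding=1)

ok_first = gradcheck(conv, (x, w))       # checks the first derivative
ok_second = gradgradcheck(conv, (x, w))  # checks the derivative of the backward
```

Both calls compare analytical gradients against finite differences and raise (or return False) on a mismatch, so a passing `gradgradcheck` is strong evidence the double backward is correct.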