I was trying to reproduce some of the experiments from the recent paper on shattered gradients and needed the CReLU activation, which is unavailable in PyTorch. Unfortunately, it doesn't seem to be a trivial extension of ReLU to code. Could (should?) we add this functionality to PyTorch?
Implementations of CReLU can be found in:
Lua Torch
Chainer
TensorFlow
If I am not mistaken, this function would be added to nn/_functions/thnn/activation?
Update:
For CNNs and Linear layers, I think I have a temporary solution.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CReLU(nn.Module):
    def __init__(self):
        super(CReLU, self).__init__()

    def forward(self, x):
        # Concatenate ReLU(x) and ReLU(-x) along the channel axis
        return torch.cat((F.relu(x), F.relu(-x)), 1)
Obviously, there are some issues with this (hard-coded axis argument, no support for in-place operation, etc.).
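A quick sanity check of the workaround, with the axis made configurable instead of hard-coded (the `dim` argument is my addition, not part of the original snippet). Since CReLU concatenates ReLU(x) and ReLU(-x), the output should double the size of the chosen dimension and be everywhere non-negative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CReLU(nn.Module):
    """Concatenates ReLU(x) and ReLU(-x) along a configurable axis."""
    def __init__(self, dim=1):
        super(CReLU, self).__init__()
        self.dim = dim  # axis along which to concatenate (1 = channels for NCHW)

    def forward(self, x):
        return torch.cat((F.relu(x), F.relu(-x)), self.dim)

x = torch.randn(8, 16, 32, 32)  # (batch, channels, H, W)
y = CReLU()(x)
print(y.shape)  # channel dimension doubles: torch.Size([8, 32, 32, 32])
```

Note that the first half of the output along `dim` is exactly `F.relu(x)`, so a standard ReLU network can be recovered by ignoring the second half.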