
Adding the CReLU Activation #1327

@keskarnitish


I was trying to reproduce some of the experiments from the recent paper on shattered gradients and needed the CReLU activation, which is unavailable in PyTorch. Unfortunately, it doesn't seem to be a trivial extension of ReLU to implement. Could (should?) we add this functionality to PyTorch?
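For context, CReLU concatenates the rectified positive and negative parts of the input, CReLU(x) = concat(ReLU(x), ReLU(-x)), along the channel axis, so the output has twice as many channels as the input. A quick sanity check in plain PyTorch:

    import torch
    import torch.nn.functional as F

    # CReLU keeps both the positive and the negated-negative part of x,
    # doubling the size along the concatenation dimension.
    x = torch.tensor([[1.0, -2.0]])
    print(torch.cat((F.relu(x), F.relu(-x)), dim=1))  # tensor([[1., 0., 0., 2.]])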

Implementations of CReLU can be found in:

- Lua Torch
- Chainer
- TensorFlow

If I am not mistaken, this function would be added to nn/_functions/thnn/activation?

Update:

For CNNs and Linear layers, I think I have a temporary solution.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CReLU(nn.Module):
        def __init__(self):
            super(CReLU, self).__init__()

        def forward(self, x):
            # Concatenate the positive and negative rectified parts along dim 1
            # (the channel dimension for NCHW conv features and Linear outputs).
            return torch.cat((F.relu(x), F.relu(-x)), 1)

Obviously, there are some issues with this (the concatenation axis is hard-coded, it doesn't allow in-place operation, etc.); a possible generalization is sketched below.
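As a rough sketch of how the hard-coded axis could be addressed, here is a version that takes the concatenation dimension as a constructor argument (the `dim` parameter name is my own choice for illustration, not an existing PyTorch convention):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CReLU(nn.Module):
        """Sketch of CReLU with a configurable concatenation dimension.

        The `dim` argument is a hypothetical parameter, not an established API.
        """
        def __init__(self, dim=1):
            super(CReLU, self).__init__()
            self.dim = dim

        def forward(self, x):
            return torch.cat((F.relu(x), F.relu(-x)), dim=self.dim)

With the default `dim=1`, NCHW convolutional features are concatenated along channels, so any layer consuming the output needs to expect twice the channel count. The in-place question remains open, since CReLU changes the output shape.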
