Feature Request: ReLU on LSTMs and GRUs #1932

@samhaaf

Description

Hello, I would really like the ability to use a different activation function on the output of the recurrent layers (LSTM, GRU). I have found ReLU more useful in classification models because, for instance, the tanh output of an LSTM makes it easy for a subsequent linear-softmax layer to saturate at values near 0.999. Just throwing the idea out there in case someone wants to include it in an upcoming release.
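(For context: torch.nn.RNN already accepts nonlinearity='relu', but nn.LSTM and nn.GRU are hard-wired to tanh. As a rough interim workaround, and only a sketch rather than the torch.nn implementation, a hand-rolled cell like the one below can swap the activation applied to the cell state. The names CustomActivationLSTMCell and run_sequence are made up here for illustration.)

```python
import torch
import torch.nn as nn


class CustomActivationLSTMCell(nn.Module):
    """Single LSTM time step with a configurable activation on the cell output.

    A standard LSTM forms h_t = o_t * tanh(c_t); this sketch lets you pass
    e.g. torch.relu instead. Illustrative only, not part of torch.nn.
    """

    def __init__(self, input_size, hidden_size, output_activation=torch.relu):
        super().__init__()
        self.hidden_size = hidden_size
        # One fused linear layer per input produces the pre-activations
        # of all four gates (input, forget, candidate, output).
        self.x2h = nn.Linear(input_size, 4 * hidden_size)
        self.h2h = nn.Linear(hidden_size, 4 * hidden_size)
        self.output_activation = output_activation

    def forward(self, x, state):
        h, c = state
        gates = self.x2h(x) + self.h2h(h)
        i, f, g, o = gates.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)                              # candidate cell values
        c_next = f * c + i * g
        h_next = o * self.output_activation(c_next)    # ReLU here instead of tanh
        return h_next, c_next


def run_sequence(cell, inputs):
    """Loop the cell over a (seq_len, batch, input_size) tensor."""
    batch = inputs.size(1)
    h = inputs.new_zeros(batch, cell.hidden_size)
    c = inputs.new_zeros(batch, cell.hidden_size)
    outputs = []
    for x_t in inputs.unbind(0):
        h, c = cell(x_t, (h, c))
        outputs.append(h)
    return torch.stack(outputs, dim=0), (h, c)
```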

cc @zou3519

Metadata

    Labels

    feature (A request for a proper, new feature)
    module: nn (Related to torch.nn)
    module: rnn (Issues related to RNN support: LSTM, GRU, etc.)
    triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
