Open
Labels
feature - A request for a proper, new feature.
module: nn - Related to torch.nn
module: rnn - Issues related to RNN support (LSTM, GRU, etc)
triaged - This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
Description
Hello, I would really like the ability to use a different activation function on the output of the RNNs. I have found ReLU more useful in classification models because, for instance, the tanh output of an LSTM makes it easy for a subsequent softmax-linear layer to produce values near 0.999. Just throwing the idea out there in case someone wants to include it in an upcoming release.
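For anyone who wants this behavior today, the change being requested only touches the last step of the LSTM cell, where the new hidden state is computed as `h = o * tanh(c)`. A minimal NumPy sketch (not PyTorch's actual implementation; the weight layout and `out_act` parameter are assumptions for illustration) shows how the output activation could be made swappable:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h, c, W, U, b, out_act=np.tanh):
    """One LSTM step with a swappable output activation.

    W: (4H, D), U: (4H, H), b: (4H,), gates stacked as [i, f, g, o].
    `out_act` is a hypothetical parameter replacing the usual tanh
    applied to the new cell state, which is the change this issue asks
    for; pass a ReLU to get non-negative hidden outputs.
    """
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    g = np.tanh(z[2*H:3*H])    # candidate cell update
    o = sigmoid(z[3*H:4*H])    # output gate
    c_new = f * c + i * g
    h_new = o * out_act(c_new)  # tanh by default; ReLU keeps h_new >= 0
    return h_new, c_new

relu = lambda x: np.maximum(x, 0.0)

rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
x = rng.normal(size=D)
h0, c0 = np.zeros(H), np.zeros(H)

h_relu, _ = lstm_cell(x, h0, c0, W, U, b, out_act=relu)
h_tanh, _ = lstm_cell(x, h0, c0, W, U, b)  # default tanh output
```

With ReLU as `out_act`, every component of the hidden state is non-negative and unbounded above, whereas the default tanh keeps it in (-1, 1), which is the saturation behavior described above. A built-in hook like this would avoid reimplementing the cell loop (and losing the fused cuDNN kernels) just to change one activation.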
cc @zou3519