Open
Labels
module: loss (Problem is related to loss function), module: nn (Related to torch.nn), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Description
🚀 The feature, motivation and pitch
POLYLOSS: A POLYNOMIAL EXPANSION PERSPECTIVE OF CLASSIFICATION LOSS FUNCTIONS
(Published as a conference paper at ICLR 2022) link to paper
Implementing this loss function won't be difficult, as we can build it on the built-in cross-entropy or focal loss. It has outperformed other existing loss functions in several benchmark experiments.
Here is a code snippet
# cross entropy as poly loss
def poly1_cross_entropy(logits, labels, epsilon=1.0):
    # pt, CE, and Poly1 have shape [batch].
    pt = tf.reduce_sum(labels * tf.nn.softmax(logits), axis=-1)
    CE = tf.nn.softmax_cross_entropy_with_logits(labels, logits)
    Poly1 = CE + epsilon * (1 - pt)
    return Poly1
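Since the request is to bring this into torch, a minimal PyTorch translation of the cross-entropy variant could look like the sketch below. This is my own port of the TF snippet, not official PyTorch API: the name `poly1_cross_entropy_torch` is illustrative, and it assumes integer class-index labels (unlike the TF version, which takes one-hot labels).

```python
import torch
import torch.nn.functional as F

def poly1_cross_entropy_torch(logits, labels, epsilon=1.0):
    """Poly-1 cross entropy (sketch). `labels` are class indices, shape [batch]."""
    # pt: probability the model assigns to the true class, shape [batch]
    probs = F.softmax(logits, dim=-1)
    pt = probs.gather(-1, labels.unsqueeze(-1)).squeeze(-1)
    # per-example cross entropy, shape [batch]
    ce = F.cross_entropy(logits, labels, reduction="none")
    # Poly-1: add the leading polynomial term epsilon * (1 - pt)
    return ce + epsilon * (1 - pt)
```

With `epsilon=0.0` this reduces exactly to the standard per-example cross entropy, which is a convenient sanity check.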
# focal loss as poly loss
def poly1_focal_loss(logits, labels, epsilon=1.0, gamma=2.0):
    # p, pt, FL, and Poly1 have shape [batch, num of classes].
    p = tf.math.sigmoid(logits)
    pt = labels * p + (1 - labels) * (1 - p)
    FL = focal_loss(pt, gamma)
    Poly1 = FL + epsilon * tf.math.pow(1 - pt, gamma + 1)
    return Poly1
I would like to contribute and add this feature to torch.
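For the focal variant, a possible PyTorch sketch is below. Again this is my own translation, not existing torch API: `poly1_focal_loss_torch` is an illustrative name, the undefined `focal_loss` helper from the TF snippet is expanded inline as the standard sigmoid focal loss (with the alpha weighting omitted), and multi-hot float labels are assumed.

```python
import torch
import torch.nn.functional as F

def poly1_focal_loss_torch(logits, labels, epsilon=1.0, gamma=2.0):
    """Poly-1 focal loss (sketch). `labels` are multi-hot floats, shape [batch, num_classes]."""
    p = torch.sigmoid(logits)
    # pt: probability of the correct outcome for each class, shape [batch, num_classes]
    pt = labels * p + (1 - labels) * (1 - p)
    # standard sigmoid focal loss, written inline (alpha weighting omitted)
    bce = F.binary_cross_entropy_with_logits(logits, labels, reduction="none")
    fl = (1 - pt) ** gamma * bce
    # Poly-1: add the leading polynomial term epsilon * (1 - pt)^(gamma + 1)
    return fl + epsilon * (1 - pt) ** (gamma + 1)
```

Setting `epsilon=0.0` recovers plain focal loss, and every element of the result is non-negative since both terms are.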