
Adding Polyloss to torch #76732

@tanmoyio

Description


🚀 The feature, motivation and pitch

POLYLOSS: A POLYNOMIAL EXPANSION PERSPECTIVE OF CLASSIFICATION LOSS FUNCTIONS

(Published as a conference paper at ICLR 2022) link to paper

Implementing this loss function won't be difficult, as it can be built on top of the built-in cross-entropy loss or focal loss. It has outperformed other existing loss functions in some benchmark experiments.

Here are the reference code snippets from the paper (written in TensorFlow):

# cross entropy as polyloss
def poly1_cross_entropy(logits, labels, epsilon=1.0):
    # pt, CE, and Poly1 have shape [batch].
    pt = tf.reduce_sum(labels * tf.nn.softmax(logits), axis=-1)
    CE = tf.nn.softmax_cross_entropy_with_logits(labels, logits)
    Poly1 = CE + epsilon * (1 - pt)
    return Poly1
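For reference, a minimal PyTorch sketch of the same Poly-1 cross-entropy is shown below. It assumes integer class-index labels (rather than the one-hot labels of the TensorFlow snippet); the function name mirrors the paper's snippet and is not an existing torch API.

```python
import torch
import torch.nn.functional as F

def poly1_cross_entropy(logits, labels, epsilon=1.0):
    # logits: [batch, num_classes]; labels: [batch] of class indices.
    # pt is the softmax probability of the true class, shape [batch].
    probs = F.softmax(logits, dim=-1)
    pt = probs.gather(1, labels.unsqueeze(1)).squeeze(1)
    # Per-sample cross entropy, shape [batch].
    ce = F.cross_entropy(logits, labels, reduction="none")
    # Poly-1: CE plus the leading polynomial correction term.
    return ce + epsilon * (1 - pt)
```

With epsilon = 0 this reduces exactly to the plain cross-entropy loss, which makes it easy to verify against the existing implementation.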

# focal loss as polyloss
def poly1_focal_loss(logits, labels, epsilon=1.0, gamma=2.0):
    # p, pt, FL, and Poly1 have shape [batch, num of classes].
    p = tf.math.sigmoid(logits)
    pt = labels * p + (1 - labels) * (1 - p)
    FL = focal_loss(pt, gamma)
    Poly1 = FL + epsilon * tf.math.pow(1 - pt, gamma + 1)
    return Poly1
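A corresponding PyTorch sketch of the Poly-1 focal loss is below. The snippet above calls an undefined `focal_loss(pt, gamma)`; here I spell it out as the standard sigmoid focal loss, `(1 - pt)^gamma * BCE`, which is an assumption about what the paper's helper computes. Labels are 0/1 float targets of shape `[batch, num_classes]`, as in the TensorFlow version.

```python
import torch
import torch.nn.functional as F

def poly1_focal_loss(logits, labels, epsilon=1.0, gamma=2.0):
    # logits, labels: [batch, num_classes]; labels are 0/1 targets.
    p = torch.sigmoid(logits)
    # pt: probability assigned to the correct binary target.
    pt = labels * p + (1 - labels) * (1 - p)
    # Standard sigmoid focal loss (assumed form of focal_loss above).
    bce = F.binary_cross_entropy_with_logits(logits, labels, reduction="none")
    fl = ((1 - pt) ** gamma) * bce
    # Poly-1: focal loss plus the leading polynomial correction term.
    return fl + epsilon * (1 - pt) ** (gamma + 1)
```

As with the cross-entropy variant, epsilon = 0 recovers the plain focal loss, so an implementation in torch could share code with the existing losses.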

I would like to contribute this feature to torch.

cc @albanD @mruberry @jbschlosser @walterddr @kshitij12345


Labels: module: loss (Problem is related to loss function), module: nn (Related to torch.nn), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
