Ashish Patel 🇮🇳’s Post

𝗗𝗮𝘆-𝟰𝟮𝟰 𝗖𝗼𝗺𝗽𝘂𝘁𝗲𝗿 𝗩𝗶𝘀𝗶𝗼𝗻 𝗟𝗲𝗮𝗿𝗻𝗶𝗻𝗴

Long-Tailed Classification with Gradual Balanced Loss and Adaptive Feature Generation, by Huazhong University of Science and Technology, China

Follow me for similar posts: Ashish Patel

-------------------------------------------------------------------
𝗜𝗻𝘁𝗲𝗿𝗲𝘀𝘁𝗶𝗻𝗴 𝗙𝗮𝗰𝘁𝘀:
🔸 This paper was published on arXiv in 2022.

🪐 What is Long-Tailed Learning?
Long-tailed learning, one of the most challenging problems in visual recognition, aims to train well-performing models from a large number of images that follow a long-tailed class distribution.

-------------------------------------------------------------------
𝗜𝗠𝗣𝗢𝗥𝗧𝗔𝗡𝗖𝗘
➡️ Real-world data distributions are essentially long-tailed, which poses a great challenge to deep models.
➡️ This work proposes a new method, Gradual Balanced Loss and Adaptive Feature Generator (GLAG), to alleviate class imbalance.
➡️ GLAG first learns a balanced and robust feature model with the Gradual Balanced Loss, then freezes the feature model and augments the under-represented tail classes at the feature level using knowledge from the well-represented head classes.
➡️ The generated samples are mixed with real training samples during the training epochs.
➡️ The Gradual Balanced Loss is a general loss that can be combined with different decoupled training methods to improve their original performance.
➡️ State-of-the-art results are achieved on long-tailed datasets such as CIFAR100-LT, ImageNet-LT, and iNaturalist, demonstrating the effectiveness of GLAG for long-tailed visual recognition.

#computervision #artificialintelligence #data
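The two ideas above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: I assume a schedule that gradually interpolates from uniform class weights toward class-balanced ("effective number") weights as epochs progress, and a simple mixup of generated tail-class features with real features. The function names, the `beta_max` parameter, and the exact schedule are my own assumptions for illustration.

```python
import numpy as np

def gradual_balanced_weights(class_counts, epoch, total_epochs, beta_max=0.9999):
    """Hypothetical 'gradual balancing' schedule: start with uniform per-class
    weights, and move toward class-balanced weights (effective-number style)
    as training progresses. Assumes class_counts >= 1 for every class."""
    t = epoch / max(total_epochs - 1, 1)       # progress in [0, 1]
    beta = t * beta_max                        # beta = 0 -> uniform weights
    effective_num = 1.0 - np.power(beta, class_counts)
    weights = (1.0 - beta) / effective_num
    # Normalize so the average class weight stays 1.
    return weights / weights.sum() * len(class_counts)

def feature_level_mixup(real_feat, gen_feat, alpha=0.2, rng=None):
    """Mix a generated tail-class feature with a real feature, mixup-style.
    Illustrative only; GLAG's adaptive feature generator is more involved."""
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)               # mixing coefficient in (0, 1)
    return lam * real_feat + (1.0 - lam) * gen_feat

# Early in training: weights are uniform; late: rare classes weigh more.
counts = np.array([500, 50, 5])
print(gradual_balanced_weights(counts, epoch=0, total_epochs=100))
print(gradual_balanced_weights(counts, epoch=99, total_epochs=100))
```

Late in training the weight for the 5-sample class dominates the 500-sample class, which is the rebalancing effect the post describes, while the gradual schedule avoids distorting feature learning in the early epochs.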
