𝗗𝗮𝘆-𝟭𝟴𝟰 Computer Vision Learning
𝗥𝗲𝗫𝗡𝗲𝘁: Diminishing Representational Bottleneck on Convolutional Neural Network, by Clova AI Research, NAVER Corp
Follow me for similar posts: 🇮🇳 Ashish Patel

Interesting Facts :
🔸 This paper was published at CVPR 2021.
🔸 It outperforms MobileNetV1/V2/V3, EfficientNet, and other efficient architectures.
-------------------------------------------------------------------
𝗔𝗺𝗮𝘇𝗶𝗻𝗴 𝗥𝗲𝘀𝗲𝗮𝗿𝗰𝗵 : https://lnkd.in/erC2RRn
Code : https://lnkd.in/euAuJa3
-------------------------------------------------------------------
𝗜𝗠𝗣𝗢𝗥𝗧𝗔𝗡𝗖𝗘
🔸 This paper addresses the representational bottleneck in a network and proposes a set of design principles that significantly improves model performance.
🔸 The authors argue that a representational bottleneck can arise in a conventionally designed network and degrade model performance. To investigate this, they study the matrix rank of the features generated by ten thousand random networks, and further study the channel configuration across all layers to design more accurate network architectures.
🔸 Based on this investigation, they propose simple yet effective design principles to mitigate the representational bottleneck. Slight changes to baseline networks following these principles lead to remarkable performance improvements on ImageNet classification.
#computervision #artificialintelligence #data
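To build intuition for the rank-based analysis described above, here is a minimal NumPy sketch (my own illustration, not the paper's code): it measures the matrix rank of features produced by a random two-layer network. When an intermediate layer is narrow, the rank of the output features is capped by that layer's width, which is the kind of representational bottleneck the paper studies.

```python
import numpy as np

def feature_rank(d_in, d_hidden, d_out, n_samples=1024, seed=0):
    """Rank of output features from a random two-layer ReLU network.

    A hypothetical toy setup: random Gaussian weights, random inputs.
    The output rank cannot exceed min(d_hidden, d_out), so a narrow
    hidden layer bottlenecks the representation.
    """
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n_samples, d_in))      # random inputs
    W1 = rng.standard_normal((d_in, d_hidden))      # expand/reduce layer
    W2 = rng.standard_normal((d_hidden, d_out))     # output projection
    H = np.maximum(X @ W1, 0.0)                     # ReLU nonlinearity
    F = H @ W2                                      # output features
    return np.linalg.matrix_rank(F)

# Narrow hidden layer: output rank is capped at 8, far below d_out = 64.
print(feature_rank(64, 8, 64))
# Wide hidden layer: output features reach (near-)full rank.
print(feature_rank(64, 128, 64))
```

Averaging such rank measurements over many random networks and channel configurations is, in spirit, how the authors identify which layer designs avoid the bottleneck.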