[Fix] Keep the same weights before reassign in the PAA head #7032
ZwwWayne merged 1 commit into open-mmlab:dev from zimoqingfeng:master
Conversation
Codecov Report
@@ Coverage Diff @@
## master #7032 +/- ##
==========================================
- Coverage 62.35% 62.34% -0.02%
==========================================
Files 327 327
Lines 26129 26129
Branches 4424 4424
==========================================
- Hits 16293 16290 -3
- Misses 8969 8971 +2
- Partials 867 868 +1
@hhaAndroid, can you please check the PR?

It looks like it's blocked at Codecov. Is there anything I can do to help? @hhaAndroid @ZwwWayne
Hi @zimoqingfeng! First of all, we want to express our gratitude for your significant PR in the MMDetection project. Your contribution is highly appreciated, and we are grateful for your efforts in helping improve this open-source project during your personal time. We believe that many developers will benefit from your PR. We would also like to invite you to join our Special Interest Group (SIG) private channel on Discord, where you can share your experiences, ideas, and build connections with like-minded peers. To join the SIG channel, simply message the moderator OpenMMLab on Discord, or briefly share your open-source contributions in the #introductions channel and we will assist you. We look forward to seeing you there! Join us: https://discord.gg/raweFPmdzG If you have a WeChat account, welcome to join our community on WeChat. You can add our assistant: openmmlabwx. Please add "mmsig + GitHub ID" as a remark when adding friends :)
Motivation
This PR addresses a concern about "Fix wrong bbox loss_weight of the PAA head (#6744)", which landed in MMDetection v2.20.0.
According to PAA's original source code, the authors appear to compute cls_loss and reg_loss with the same weight (always 1.0) before reassignment, and apply the differing loss_weight values only when computing the final loss:
https://github.com/kkhoot/PAA/blob/master/paa_core/modeling/rpn/paa/loss.py#L306
https://github.com/kkhoot/PAA/blob/master/paa_core/modeling/rpn/paa/loss.py#L356
Modification
Before reassignment, always set the loss weight to 1.0. (Note: the previous behavior of weighting all losses with self.loss_cls.loss_weight may cause some confusion.)
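The change above can be sketched as follows. This is a minimal illustration, not MMDetection's actual PAA head code: the function name and tensor shapes are assumptions, and plain BCE/L1 losses stand in for the head's configured losses. The point is that the per-anchor losses used for PAA's reassignment step are computed with both weights fixed at 1.0, so classification and regression costs are directly comparable; the configured loss_weight values apply only to the final training loss.

```python
import torch
import torch.nn.functional as F


def losses_before_reassign(cls_scores, labels, bbox_preds, bbox_targets,
                           cls_loss_weight=1.0, reg_loss_weight=1.0):
    """Hypothetical sketch of per-anchor losses for PAA reassignment.

    Both weights default to 1.0 (the fix in this PR); before reassignment
    they should not be overridden by the configured loss_weight values.
    """
    # Per-anchor classification loss, unreduced so each anchor keeps
    # its own score (summed over classes).
    cls_loss = F.binary_cross_entropy_with_logits(
        cls_scores, labels, reduction='none').sum(-1)
    # Per-anchor regression loss (plain L1 as a stand-in for the
    # head's configured bbox loss, summed over box coordinates).
    reg_loss = F.l1_loss(bbox_preds, bbox_targets, reduction='none').sum(-1)
    # With both weights at 1.0, the two terms contribute on equal footing,
    # matching the reference PAA implementation linked above.
    return cls_loss_weight * cls_loss + reg_loss_weight * reg_loss
```

The returned per-anchor cost is what PAA fits its Gaussian mixture to when deciding which anchors are positives, which is why scaling one term by a configured loss_weight at this stage would skew the assignment.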