- …
- …
#42
Summarized by: Anonymous
What is this paper about?
Investigates the impact of the backbone training strategy for object detection: trained from scratch, fine-tuned from a pre-trained initialization, or frozen at its pre-trained initialization.
Novelty
Identifies that freezing the backbone is a better way to reuse classification features for object detection, provided the remaining detection components (e.g., FPN, cascade heads) have enough capacity.
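A minimal PyTorch sketch of the frozen-backbone setup described above. The modules here are small stand-ins (not the paper's actual architecture): a toy convolutional "backbone" is frozen at its initialization and only the "head" (standing in for FPN / cascade components) receives gradient updates.

```python
import torch
from torch import nn

# Stand-in modules (assumptions, not the paper's models): in practice the
# backbone would be a pre-trained classifier and the head an FPN/cascade.
backbone = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
head = nn.Conv2d(8, 4, 1)

# Freeze the backbone at its (pre-trained) initialization.
for p in backbone.parameters():
    p.requires_grad = False
backbone.eval()  # also fixes BatchNorm/Dropout behavior if present

# The optimizer only sees the detection components' parameters.
optimizer = torch.optim.SGD(head.parameters(), lr=0.01)

x = torch.randn(1, 3, 32, 32)
with torch.no_grad():
    feats = backbone(x)  # backbone acts as a fixed feature extractor
out = head(feats)
```

Freezing also saves the memory and compute of backbone gradients, which can be reinvested in larger detection heads, matching the paper's point about giving the remaining components enough capacity.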
Results
Better results on COCO and LVIS when the backbone is frozen; classes with fewer annotations benefit more from the frozen backbone.
Other (e.g., why was it accepted?)
Fine-tuning for longer pushes the weights far from their pre-trained initialization, which explains why it becomes competitive with training the backbone from scratch when both are trained longer.
- …
- …