#76
Summarized by: Kaikai Zhao
What kind of paper is this?
Investigates how to effectively perform knowledge distillation (KD) between heterogeneous teacher-student pairs in detection tasks; the main difficulty is the semantic gap between backbone features of different architectures.
Novelty
1) Attaches an auxiliary detection head to the student whose architecture is homogeneous to the teacher's head; 2) performs distillation on the heterogeneous detection head (classification branch).
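The distillation step above can be sketched as a temperature-scaled KL divergence between the teacher's classification logits and the logits produced by the student's auxiliary head. This is a minimal illustration under assumed names and shapes, not the paper's actual implementation; in practice the student logits would come from the teacher-style head attached to the student backbone.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Temperature-scaled KL divergence between teacher soft targets and
    student predictions, averaged over anchors (standard KD loss form)."""
    p = softmax(teacher_logits / T)  # teacher soft targets
    q = softmax(student_logits / T)  # student (auxiliary-head) predictions
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return (T * T) * kl.mean()

# toy example: 4 anchors, 80 classes (COCO-like, values are random)
rng = np.random.default_rng(0)
t = rng.normal(size=(4, 80))        # teacher classification logits
s = rng.normal(size=(4, 80))        # student auxiliary-head logits
loss = kd_loss(s, t)
```

Matching logits give zero loss, so minimizing this term pulls the auxiliary head's classification distribution toward the teacher's.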
Results
Achieves significant improvements over existing detection KD methods.
Other (e.g., why was it accepted?)
CNN, feature distillation