- …
- …
#41
summarized by: Anonymous
What is this paper about?
Transfers attention maps from a high-resolution network (teacher) to a low-resolution network (student) to improve low-resolution face recognition.
Novelty
Designed a cosine-similarity-based attention distillation that covers both spatial and channel attention; see the sketch after this summary.
Results
Outperforms other distillation methods on low-resolution face recognition.
Other notes (e.g., why it was accepted)
Distills attention produced by CNN attention modules (e.g., a CBAM block), not self-attention from a Transformer.
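A minimal PyTorch sketch of what a cosine-similarity-based spatial + channel attention distillation loss could look like. The attention definitions (`spatial_attention`, `channel_attention`), the squared-activation pooling, and the teacher-map resizing are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F


def spatial_attention(feat: torch.Tensor) -> torch.Tensor:
    # Squared-activation spatial map, shape (B, 1, H, W). Assumed form.
    return feat.pow(2).mean(dim=1, keepdim=True)


def channel_attention(feat: torch.Tensor) -> torch.Tensor:
    # Squared-activation channel vector, shape (B, C). Assumed form.
    return feat.pow(2).mean(dim=(2, 3))


def attention_distill_loss(f_student: torch.Tensor,
                           f_teacher: torch.Tensor) -> torch.Tensor:
    """1 - cosine similarity between student and teacher attention,
    summed over the spatial and channel branches."""
    s_sp = spatial_attention(f_student)
    t_sp = spatial_attention(f_teacher)
    # The low-resolution student may yield smaller maps; resize the teacher's.
    if t_sp.shape[-2:] != s_sp.shape[-2:]:
        t_sp = F.interpolate(t_sp, size=s_sp.shape[-2:],
                             mode="bilinear", align_corners=False)
    loss_spatial = 1.0 - F.cosine_similarity(
        s_sp.flatten(1), t_sp.flatten(1), dim=1).mean()
    # Channel branch assumes matching channel counts
    # (otherwise insert a 1x1 projection on the student side).
    loss_channel = 1.0 - F.cosine_similarity(
        channel_attention(f_student),
        channel_attention(f_teacher), dim=1).mean()
    return loss_spatial + loss_channel


# Usage example: distill between intermediate feature maps where the
# teacher sees a higher-resolution input than the student.
loss = attention_distill_loss(torch.randn(4, 64, 14, 14),
                              torch.randn(4, 64, 28, 28))
```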
- …
- …