#41
summarized by: Anonymous
Teaching Where to Look: Attention Similarity Knowledge Distillation for Low Resolution Face Recognition

What is the paper about?

Transfers attention maps from a high-resolution face recognition network (teacher) to a low-resolution network (student) to improve low-resolution face recognition.

Novelty

Designed a cosine-similarity-based distillation loss over both spatial and channel attention maps (see the sketch below).
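
A minimal PyTorch sketch of a cosine-similarity attention distillation loss, assuming the teacher's and student's CBAM-style attention maps have already been extracted; the function names and the per-term weights are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

def attention_similarity_loss(attn_teacher: torch.Tensor,
                              attn_student: torch.Tensor) -> torch.Tensor:
    """1 - cosine similarity between flattened attention maps.

    Works for both spatial attention (B, 1, H, W) and channel
    attention (B, C, 1, 1) from a CBAM-style module.
    """
    t = attn_teacher.flatten(start_dim=1)  # (B, D)
    s = attn_student.flatten(start_dim=1)  # (B, D)
    return (1.0 - F.cosine_similarity(s, t, dim=1)).mean()

# Illustrative combination of the spatial and channel terms for one
# layer; the weights below are assumptions, not from the paper.
def askd_layer_loss(spatial_t, spatial_s, channel_t, channel_s,
                    w_spatial: float = 1.0, w_channel: float = 1.0):
    return (w_spatial * attention_similarity_loss(spatial_t, spatial_s)
            + w_channel * attention_similarity_loss(channel_t, channel_s))
```

In practice this term would be summed over the distilled layers and added to the student's recognition loss; the teacher's maps are detached so gradients flow only into the student.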

Results

Outperforms other knowledge distillation methods on low-resolution face recognition.

Other (why was it accepted? etc.)

The distillation targets attention maps produced by a CNN attention module (e.g., a CBAM block), not self-attention from a Transformer (sketched below).
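
For context, a minimal sketch of a CBAM-style module showing the channel and spatial attention maps that would serve as distillation targets; the class name and layer sizes are illustrative, not the paper's implementation.

```python
import torch
import torch.nn as nn

class CBAMAttention(nn.Module):
    """CBAM-style block exposing its channel and spatial attention
    maps so they can be matched between teacher and student."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor):
        b, c, _, _ = x.shape
        # Channel attention: shared MLP over avg- and max-pooled descriptors.
        avg = self.mlp(x.mean(dim=(2, 3)))
        mx = self.mlp(x.amax(dim=(2, 3)))
        channel_attn = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        x = x * channel_attn
        # Spatial attention: conv over channel-wise avg and max maps.
        spatial_in = torch.cat([x.mean(dim=1, keepdim=True),
                                x.amax(dim=1, keepdim=True)], dim=1)
        spatial_attn = torch.sigmoid(self.spatial_conv(spatial_in))
        return x * spatial_attn, channel_attn, spatial_attn
```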