cvpaper.challenge
CVPR2020 Paper Summaries
tag: knowledge-distillation
Revisiting Knowledge Distillation via Label Smoothing Regularization
by: Hiroki Yamamoto
Knowledge Distillation
Transfer Learning
Knowledge As Priors: Cross-Modal Knowledge Generalization for Datasets Without Superior Knowledge
by: Masuyama Yoshiki
cross-modal
multi-modal
knowledge distillation
meta learning
Online Knowledge Distillation via Collaborative Learning
by: Yue Qiu
Knowledge Distillation
Distilling Cross-Task Knowledge via Relationship Matching
by: Yamato Okamoto
Knowledge Distillation
Model Reuse
Knowledge Transfer
Cross-Task Learning
Embedding Learning
More Grounded Image Captioning by Distilling Image-Text Matching Model
by: Yue Qiu
Image Captioning
Knowledge Distillation