    cvpaper.challenge

CVPR2022 Paper Summaries

tag: knowledge-distillation

Evaluation-Oriented Knowledge Distillation for Deep Face Recognition
by: 鈴木共生
Knowledge distillation, Recognition, Face
Knowledge Distillation As Efficient Pre-Training: Faster Convergence, Higher Data-Efficiency, and Better Transferability
by: 田所龍
Knowledge distillation, Representation learning
TransGeo: Transformer Is All You Need for Cross-View Image Geo-Localization
by: Anonymous
Attention, Knowledge distillation, Object detection
MiniViT: Compressing Vision Transformers With Weight Multiplexing
by: Anonymous
Knowledge distillation, Object detection, Recognition
CHEX: CHannel EXploration for CNN Model Compression
by: Ryo Takahashi
Knowledge distillation, Recognition
©2019 cvpaper.challenge