cvpaper.challenge

CVPR2020 Paper Summaries

tag: distillation

Auxiliary Training: Towards Accurate and Robust Models
by: 福原吉博 (Yoshihiro Fukuhara)
tags: robustness, distillation, trade-off
Neural Networks Are More Productive Teachers Than Human Raters: Active Mixup for Data-Efficient Knowledge Distillation From a Blackbox Model
by: Hiroki Ohashi
tags: distillation, active learning, mix-up
Collaborative Distillation for Ultra-Resolution Universal Style Transfer
by: 野中琢登
tags: style transfer, distillation
©2019 cvpaper.challenge