
Paper Summaries

ACL 2021
tag: pre-training

Are Pretrained Convolutions Better than Pretrained Transformers?
by: Shintaro Yamamoto
tags: pre-training, CNN
LayoutLMv2: Multi-modal Pre-training for Visually-rich Document Understanding
by: Yoshiki Kubotani
tags: pre-training, two-stream, transformer, layout, multimodality