AUGKD: INGENIOUS AUGMENTATIONS EMPOWER KNOWLEDGE DISTILLATION FOR IMAGE SUPER-RESOLUTION

  • Yun Zhang
  • Wei Li
  • Simiao Li
  • Hanting Chen
  • Zhijun Tu
  • Bingyi Jing
  • Shaohui Lin
  • Jie Hu*
  • Wenjia Wang*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Knowledge distillation (KD) compresses deep neural networks by transferring task-related knowledge from cumbersome pre-trained teacher models to compact student models. However, vanilla KD for image super-resolution (SR) networks yields only limited improvements due to the inherent nature of SR tasks, where the outputs of teacher models are noisy approximations of the high-quality label images. In this work, we show that the potential of vanilla KD has been underestimated and demonstrate that the ingenious application of data augmentation can close the gap between it and more complex, well-designed methods. Unlike conventional training processes, which typically apply image augmentations simultaneously to both low-quality inputs and high-quality labels, the proposed AugKD utilizes unpaired data augmentations to (1) generate auxiliary distillation samples and (2) impose label consistency regularization. Comprehensive experiments show that AugKD significantly outperforms existing state-of-the-art KD methods across a range of SR tasks.
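Read as a recipe, the abstract suggests a training step with a supervised reconstruction loss, a vanilla KD loss, and two unpaired-augmentation terms. The PyTorch sketch below illustrates one plausible form of such a step; `augkd_step`, `zoom_crop`, the flip-based consistency term, and the equal loss weights are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

# Minimal sketch of a distillation step in the spirit of the abstract.
# `student`, `teacher`, and `zoom_crop` are placeholders and the loss
# weighting is an assumption -- this is not the authors' implementation.

def augkd_step(student, teacher, lr, hr, zoom_crop,
               w_kd=1.0, w_aux=1.0, w_lc=1.0):
    """One step: reconstruction + vanilla KD + two unpaired-augmentation terms."""
    with torch.no_grad():
        t_out = teacher(lr)              # teacher output: a noisy
                                         # approximation of the HR label
    s_out = student(lr)

    loss_rec = F.l1_loss(s_out, hr)      # supervised reconstruction
    loss_kd = F.l1_loss(s_out, t_out)    # vanilla distillation

    # (1) Auxiliary distillation sample: augment only the LR input, so the
    # student mimics the teacher on inputs that have no paired HR label.
    lr_aug = zoom_crop(lr)
    with torch.no_grad():
        t_aug = teacher(lr_aug)
    loss_aux = F.l1_loss(student(lr_aug), t_aug)

    # (2) Label consistency regularization (illustrated with a horizontal
    # flip): an invertible augmentation applied to the input is undone on
    # the output before comparing against the unchanged label.
    lr_flip = torch.flip(lr, dims=[-1])
    loss_lc = F.l1_loss(torch.flip(student(lr_flip), dims=[-1]), hr)

    return loss_rec + w_kd * loss_kd + w_aux * loss_aux + w_lc * loss_lc
```

Here `zoom_crop` stands in for any augmentation that creates extra teacher-student pairs from inputs without labels, while the flip term shows the general pattern of label consistency: augment the input, invert the augmentation on the output, and compare against the unchanged label.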

Original language: English
Title of host publication: 13th International Conference on Learning Representations, ICLR 2025
Publisher: International Conference on Learning Representations, ICLR
Pages: 18119-18135
Number of pages: 17
ISBN (electronic): 9798331320850
Publication status: Published - 2025
Event: 13th International Conference on Learning Representations, ICLR 2025 - Singapore, Singapore
Duration: 24 Apr 2025 → 28 Apr 2025

Publication series

Name: 13th International Conference on Learning Representations, ICLR 2025

Conference

Conference: 13th International Conference on Learning Representations, ICLR 2025
Country/Territory: Singapore
City: Singapore
Period: 24/04/25 → 28/04/25
