
Harmonizing Knowledge Transfer in Neural Network with Unified Distillation

  • East China Normal University

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-reviewed

Abstract

Knowledge distillation (KD), known for its ability to transfer knowledge from a cumbersome network (teacher) to a lightweight one (student) without altering the architecture, has been garnering increasing attention. Two primary categories emerge within KD methods: feature-based, focusing on intermediate layers’ features, and logits-based, targeting the final layer’s logits. This paper introduces a novel perspective by leveraging diverse knowledge sources within a unified KD framework. Specifically, we aggregate features from intermediate layers into a comprehensive representation, effectively gathering semantic information from different stages and scales. Subsequently, we predict the distribution parameters from this representation. These steps transform knowledge from the intermediate layers into corresponding distributive forms, thereby allowing for knowledge distillation through a unified distribution constraint at different stages of the network, ensuring the comprehensiveness and coherence of knowledge transfer. Numerous experiments were conducted to validate the effectiveness of the proposed method.
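The pipeline the abstract outlines — pooling intermediate-layer features into one representation, mapping that representation to distribution parameters, and matching teacher and student distributions — could be sketched roughly as follows. This is a minimal numpy illustration with hypothetical shapes and toy linear heads; the paper's actual aggregation module and distribution parameterization are not specified here.

```python
import numpy as np

def softmax(x):
    """Row-wise softmax, numerically stabilized."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def aggregate_stages(stage_feats):
    """Global-average-pool each stage's feature map (N, C, H, W)
    and concatenate along channels -> (N, sum of C)."""
    pooled = [f.mean(axis=(2, 3)) for f in stage_feats]
    return np.concatenate(pooled, axis=1)

def kl_divergence(p, q, eps=1e-8):
    """Mean KL(p || q) over a batch of categorical distributions."""
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1).mean()

rng = np.random.default_rng(0)
# Hypothetical two-stage features for a batch of 4 inputs;
# the teacher is wider than the student at each stage.
t_feats = [rng.normal(size=(4, 64, 8, 8)), rng.normal(size=(4, 128, 4, 4))]
s_feats = [rng.normal(size=(4, 32, 8, 8)), rng.normal(size=(4, 64, 4, 4))]

t_repr = aggregate_stages(t_feats)   # (4, 192)
s_repr = aggregate_stages(s_feats)   # (4, 96)

# Toy linear heads predicting parameters of a shared K-way distribution
# from each network's aggregated representation (stand-ins for the
# paper's learned prediction step).
K = 10
W_t = rng.normal(size=(t_repr.shape[1], K)) * 0.1
W_s = rng.normal(size=(s_repr.shape[1], K)) * 0.1

p_teacher = softmax(t_repr @ W_t)
p_student = softmax(s_repr @ W_s)

# The unified distribution constraint: penalize divergence between
# the two predicted distributions.
loss = kl_divergence(p_teacher, p_student)
```

In training, the student's backbone and its head would be updated to minimize this divergence, applying the same distributional constraint at each stage of the network rather than matching raw feature tensors of mismatched width.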

Original language: English
Title of host publication: Computer Vision – ECCV 2024 - 18th European Conference, Proceedings
Editors: Aleš Leonardis, Elisa Ricci, Stefan Roth, Olga Russakovsky, Torsten Sattler, Gül Varol
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 58-74
Number of pages: 17
ISBN (Print): 9783031734137
DOI
Publication status: Published - 2025
Event: 18th European Conference on Computer Vision, ECCV 2024 - Milan, Italy
Duration: 29 Sep 2024 – 4 Oct 2024

Publication series

Name: Lecture Notes in Computer Science
Volume: 15091 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 18th European Conference on Computer Vision, ECCV 2024
Country/Territory: Italy
City: Milan
Period: 29/09/24 – 4/10/24
