Abstract
Existing knowledge distillation methods based on deep mutual learning have two limitations: only the discrepancy between the teacher network and the student network supervises the knowledge transfer, with no additional constraints, and the supervision is purely result-driven, neglecting process-driven supervision. This paper proposes a topology-guided adversarial deep mutual learning network (TADML). The method trains multiple classification sub-networks on the same task simultaneously, with each sub-network learning from the others. Moreover, it employs an adversarial network to adaptively measure the differences between pairwise sub-networks and optimizes the features without changing the model structure. Experimental results on three classification datasets (CIFAR10, CIFAR100, and Tiny-ImageNet) and a person re-identification dataset (Market1501) show that the method achieves the best results among comparable model compression methods.
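To make the deep mutual learning idea concrete, the following is a minimal, illustrative sketch of the standard mutual learning objective for a pair of student networks: each student minimizes its own cross-entropy loss plus a KL divergence term toward its peer's predicted distribution. This is the generic deep mutual learning formulation, not the paper's full TADML method (the topology guidance and adversarial discrepancy measure are omitted); the function names are hypothetical.

```python
import math

def softmax(logits):
    """Convert raw logits into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_div(p, q):
    """KL(p || q) between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_learning_losses(logits_a, logits_b, label):
    """Per-student losses in plain deep mutual learning.

    Each student combines supervised cross-entropy on the true label
    with a KL term that pulls its distribution toward its peer's.
    """
    p_a, p_b = softmax(logits_a), softmax(logits_b)
    ce_a = -math.log(p_a[label])
    ce_b = -math.log(p_b[label])
    loss_a = ce_a + kl_div(p_b, p_a)  # student A mimics student B
    loss_b = ce_b + kl_div(p_a, p_b)  # student B mimics student A
    return loss_a, loss_b
```

When the two students agree exactly, the KL terms vanish and each loss reduces to plain cross-entropy; the more the students disagree, the stronger the mutual supervision signal becomes. TADML replaces this fixed KL discrepancy with an adaptively learned adversarial measure.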
| Translated title of the contribution | Topology-guided Adversarial Deep Mutual Learning for Knowledge Distillation |
|---|---|
| Original language | Chinese (Traditional) |
| Pages (from-to) | 102-110 |
| Number of pages | 9 |
| Journal | Zidonghua Xuebao/Acta Automatica Sinica |
| Volume | 49 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2023 |