
Multi-exit self-distillation with appropriate teachers


  • Research Article
  • Published in Frontiers of Information Technology & Electronic Engineering

Abstract

Multi-exit architecture allows early-stop inference to reduce computational cost, which can be used in resource-constrained circumstances. Recent works combine the multi-exit architecture with self-distillation to simultaneously achieve high efficiency and decent performance at different network depths. However, existing methods mainly transfer knowledge from deep exits or a single ensemble to guide all exits, without considering that inappropriate learning gaps between students and teachers may degrade the model performance, especially in shallow exits. To address this issue, we propose Multi-exit self-distillation with Appropriate TEachers (MATE) to provide diverse and appropriate teacher knowledge for each exit. In MATE, multiple ensemble teachers are obtained from all exits with different trainable weights. Each exit subsequently receives knowledge from all teachers, while focusing mainly on its primary teacher to keep an appropriate gap for efficient knowledge transfer. In this way, MATE achieves diversity in knowledge distillation while ensuring learning efficiency. Experimental results on CIFAR-100, TinyImageNet, and three fine-grained datasets demonstrate that MATE consistently outperforms state-of-the-art multi-exit self-distillation methods with various network architectures.
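The abstract above summarizes the mechanism: ensemble teachers are built from all exits with trainable mixing weights, and each exit distills from every teacher while concentrating on its primary one. Since the full text is not included here, the following PyTorch sketch only illustrates how such an objective could be wired, assuming one ensemble teacher per exit and that teacher i is primary for exit i; the class name MATEStyleLoss, the softmax parameterization of the mixing weights, the primary_weight emphasis, and the auxiliary teacher cross-entropy term are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MATEStyleLoss(nn.Module):
    """Sketch of a MATE-style objective: every exit is trained with
    cross-entropy, multiple ensemble teachers are formed from all exits
    with trainable weights, and each exit distills from all teachers
    while emphasizing its (assumed) primary teacher."""

    def __init__(self, num_exits: int, temperature: float = 3.0,
                 primary_weight: float = 0.7):
        super().__init__()
        self.num_exits = num_exits
        self.T = temperature
        self.primary_weight = primary_weight  # assumed emphasis on the primary teacher
        # one trainable weight vector per ensemble teacher (row t builds teacher t)
        self.ensemble_weights = nn.Parameter(torch.zeros(num_exits, num_exits))

    def forward(self, exit_logits, targets):
        # exit_logits: list of [B, C] logit tensors, ordered shallow -> deep
        stacked = torch.stack(exit_logits, dim=0)                  # [E, B, C]

        # (1) standard cross-entropy on every exit
        ce = sum(F.cross_entropy(logits, targets) for logits in exit_logits)

        # (2) build one ensemble teacher per exit; the mixing weights are the
        # only trainable parameters here, so the exit logits are detached
        w = F.softmax(self.ensemble_weights, dim=1)                # [E, E]
        teachers = torch.einsum('te,ebc->tbc', w, stacked.detach())

        # train the ensemble weights so that each teacher itself predicts well
        teacher_ce = sum(F.cross_entropy(teachers[t], targets)
                         for t in range(self.num_exits))

        # (3) each exit distills from all teachers, mainly from its primary one
        kd = stacked.new_zeros(())
        for i, student in enumerate(exit_logits):
            log_p = F.log_softmax(student / self.T, dim=1)
            for t in range(self.num_exits):
                q = F.softmax(teachers[t].detach() / self.T, dim=1)
                coef = (self.primary_weight if t == i
                        else (1.0 - self.primary_weight) / (self.num_exits - 1))
                kd = kd + coef * (self.T ** 2) * F.kl_div(
                    log_p, q, reduction='batchmean')

        return ce + teacher_ce + kd
```

In this sketch the exits are detached when the teachers are formed, so the mixing weights are learned only through the teachers' own cross-entropy, and the teachers are detached again in the KL term, so each exit is pulled toward its teachers rather than the reverse. A typical call would be loss = criterion([logits_exit1, ..., logits_exitE], labels), with all exit logits produced by one forward pass of the multi-exit backbone.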



Data availability

The data that support the findings of this study are openly available. The data based on CIFAR-100 are available from https://www.cs.toronto.edu/~kriz/cifar.html.

The data based on TinyImageNet are available from http://cs231n.stanford.edu/tiny-imagenet-200.zip.

The data based on CUB-200-2011 are available from https://www.vision.caltech.edu/datasets/cub_200_2011/.

The data based on Stanford Dogs are available from http://vision.stanford.edu/aditya86/ImageNetDogs/.

The data based on FGVC-Aircraft are available from https://www.robots.ox.ac.uk/~vgg/data/fgvc-aircraft/.


Acknowledgements

The authors would like to thank the Supercomputing Center of Hangzhou City University, China, for providing advanced computing resources.

Author information


Contributions

Wujie SUN designed the research, processed the data, and drafted the paper. Defang CHEN, Can WANG, Deshi YE, Yan FENG, and Chun CHEN helped organize the paper. All the authors revised and finalized the paper.

Corresponding author

Correspondence to Can Wang (王灿).

Ethics declarations

All the authors declare that they have no conflict of interest.

Additional information

Project supported by the National Natural Science Foundation of China (No. U1866602) and the Starry Night Science Fund of Zhejiang University Shanghai Institute for Advanced Study, China (No. SN-ZJU-SIAS-001)


About this article


Cite this article

Sun, W., Chen, D., Wang, C. et al. Multi-exit self-distillation with appropriate teachers. Front Inform Technol Electron Eng 25, 585–599 (2024). https://doi.org/10.1631/FITEE.2200644

