
Cross-Layer Graph Knowledge Distillation for Image Recognition


Abstract:

Knowledge Distillation (KD) aims to improve a lightweight student network under the supervision of a large teacher network. The core idea of KD is to extract valuable knowledge from the teacher. Previous works often distill information from individual samples but ignore modeling the relations among multiple samples between the student and the teacher. We therefore propose Cross-Layer Graph Knowledge Distillation (CLGKD), which performs graph-augmented feature and relation distillation assisted by graph neural networks. We further propose a meta-learning mechanism that optimizes cross-layer matching weights to promote graph knowledge distillation (GKD) across all student and teacher layers. Experimental results on image classification and object detection demonstrate that CLGKD achieves state-of-the-art performance compared with other KD methods. Our code is available at https://github.com/cynmzzz/ICASSP2025-CLGKD.
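
The abstract gives no implementation details, but the cross-layer relation-distillation idea it describes can be illustrated with a minimal PyTorch sketch. The sketch below simplifies two aspects of the paper: it replaces the GNN-based graph augmentation with a plain cosine-similarity affinity graph over the batch, and it replaces the meta-learned matching weights with logits optimized jointly with the student via a softmax. All function and variable names (relation_graph, clgkd_relation_loss, match_logits) are illustrative assumptions, not taken from the authors' code.

    import torch
    import torch.nn.functional as F

    def relation_graph(feats: torch.Tensor) -> torch.Tensor:
        """Build a B x B sample-affinity graph (cosine similarities) for a batch."""
        z = F.normalize(feats.flatten(1), dim=1)
        return z @ z.t()

    def clgkd_relation_loss(student_feats, teacher_feats, match_logits):
        """Weighted relation distillation over all (student, teacher) layer pairs.

        match_logits: (num_student_layers, num_teacher_layers) learnable logits;
        a softmax turns them into cross-layer matching weights. The paper learns
        these weights with a meta-learning mechanism; here they are simply
        trained jointly with the student as a simplification.
        """
        weights = torch.softmax(match_logits.flatten(), dim=0).view_as(match_logits)
        loss = match_logits.new_zeros(())
        for i, fs in enumerate(student_feats):
            gs = relation_graph(fs)
            for j, ft in enumerate(teacher_feats):
                gt = relation_graph(ft)
                loss = loss + weights[i, j] * F.mse_loss(gs, gt)
        return loss

    # Toy usage: two student layers, three teacher layers, batch of 8.
    student_feats = [torch.randn(8, 64, requires_grad=True), torch.randn(8, 128, requires_grad=True)]
    teacher_feats = [torch.randn(8, 256), torch.randn(8, 512), torch.randn(8, 512)]
    match_logits = torch.nn.Parameter(torch.zeros(2, 3))
    loss = clgkd_relation_loss(student_feats, teacher_feats, match_logits)
    loss.backward()  # gradients flow to student features and matching weights

One convenient property of relation distillation, visible in the sketch, is that the affinity graphs are B x B for any feature dimensionality, so student and teacher layers can be compared across all pairs without dimension-matching projections.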
Date of Conference: 06-11 April 2025
Date Added to IEEE Xplore: 07 March 2025
Conference Location: Hyderabad, India
