
Fakd: Feature-Affinity Based Knowledge Distillation for Efficient Image Super-Resolution


Abstract:

Convolutional neural networks (CNNs) have been widely used in image super-resolution (SR). Most existing CNN-based methods pursue better performance by designing deeper or wider networks, but they suffer from heavy computational cost, which hinders the deployment of such models on mobile devices with limited resources. To alleviate this problem, we propose a novel and efficient SR model, named Feature Affinity-based Knowledge Distillation (FAKD), which transfers the structural knowledge of a heavy teacher model to a lightweight student model. To transfer the structural knowledge effectively, FAKD distills second-order statistical information from feature maps and trains a lightweight student network with low computational and memory cost. Experimental results demonstrate the efficacy of our method and its advantage over other knowledge-distillation-based methods in terms of both quantitative and visual metrics.
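The core idea the abstract describes, matching teacher and student feature maps through second-order (affinity) statistics rather than raw activations, can be sketched as follows. This is a minimal illustration of the general feature-affinity distillation technique, not the authors' exact formulation; the normalization scheme and the L1 matching loss are assumptions for the sketch.

```python
import numpy as np

def affinity_matrix(feat):
    """Second-order spatial affinity of a (C, H, W) feature map.

    Flattens spatial positions and correlates them across channels,
    yielding an (H*W, H*W) matrix that encodes structural relations
    between spatial locations (assumed normalization: unit-norm columns).
    """
    C, H, W = feat.shape
    F = feat.reshape(C, H * W)
    # Normalize each spatial column so the affinity is scale-invariant.
    Fn = F / (np.linalg.norm(F, axis=0, keepdims=True) + 1e-8)
    return Fn.T @ Fn  # (H*W, H*W)

def affinity_distillation_loss(teacher_feat, student_feat):
    """L1 distance between teacher and student affinity matrices.

    Because the affinity is (H*W, H*W), the teacher and student may have
    different channel counts (heavy vs. lightweight) as long as their
    spatial resolutions match.
    """
    A_t = affinity_matrix(teacher_feat)
    A_s = affinity_matrix(student_feat)
    return np.mean(np.abs(A_t - A_s))

# Usage: a 64-channel teacher supervising a 16-channel student.
rng = np.random.default_rng(0)
t = rng.standard_normal((64, 8, 8))
s = rng.standard_normal((16, 8, 8))
loss = affinity_distillation_loss(t, s)
```

Note that matching affinity matrices sidesteps the channel-dimension mismatch between a heavy teacher and a lightweight student, which is one reason second-order statistics are attractive for distillation in SR networks.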
Date of Conference: 25-28 October 2020
Date Added to IEEE Xplore: 30 September 2020
Conference Location: Abu Dhabi, United Arab Emirates

