Analysis of performance difference when using knowledge distillation of efficient CNN-based super-resolution algorithm


Abstract:

In this paper, we use RLFN, the winning model of the NTIRE 2022 Efficient Super-Resolution Challenge, to implement a lightweight and efficient super-resolution algorithm. We also apply knowledge distillation through a partially modified form of the PISR framework and analyze the qualitative and quantitative results, improving performance while maintaining the computational cost of RLFN.
Date of Conference: 05-08 February 2023
Date Added to IEEE Xplore: 10 March 2023
Conference Location: Singapore

I. Introduction

Super-resolution (SR) has recently become an active area of research. One line of work aims to make SR algorithms lightweight yet efficient, preserving good reconstruction quality while running in real time even on low-power hardware such as edge devices.
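The knowledge-distillation approach the paper applies can be illustrated at a high level: a lightweight student SR network is trained against both the ground-truth high-resolution image and intermediate features of a stronger teacher. The sketch below shows only this generic teacher-student loss; the function names and the `weight` hyperparameter are hypothetical, and the actual PISR framework uses a more elaborate privileged-information formulation than this.

```python
import numpy as np

def l1_loss(a, b):
    """Mean absolute error between two arrays."""
    return float(np.mean(np.abs(a - b)))

def distillation_loss(student_sr, hr, student_feat, teacher_feat, weight=0.5):
    """Combined training loss for a student SR network: an L1
    reconstruction term against the ground-truth HR image, plus a
    feature-distillation term pulling the student's intermediate
    features toward the teacher's. `weight` (hypothetical value)
    balances the two terms."""
    recon = l1_loss(student_sr, hr)
    distill = l1_loss(student_feat, teacher_feat)
    return recon + weight * distill

# Toy usage with random tensors standing in for network outputs:
rng = np.random.default_rng(0)
hr = rng.random((1, 3, 64, 64))           # ground-truth HR image
student_sr = rng.random((1, 3, 64, 64))   # student reconstruction
teacher_feat = rng.random((1, 32, 16, 16))
student_feat = rng.random((1, 32, 16, 16))
loss = distillation_loss(student_sr, hr, student_feat, teacher_feat)
```

Only the distillation term depends on the teacher, so at inference time the teacher is discarded and the student's cost is unchanged, which is what allows performance gains "while maintaining the cost of RLFN."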
