Abstract:
Recently, lightweight neural networks with various manually designed architectures have demonstrated promising performance in single image super-resolution (SR). However, these designs rely heavily on expert experience. To address this issue, we focus on searching for a lightweight block for efficient and accurate image SR. Motivated by the frequent use of residual blocks and attention mechanisms in SR methods, we propose the residual attention search block (RASB), which combines an operation search block (OSB) with an attention search block (ASB). The former explores the suitable operation at the proper position, and the latter discovers the optimal connection among various attention mechanisms. Moreover, we build the modified residual attention network (MRAN) by stacking the found blocks and adding a refinement module. Extensive experiments demonstrate that MRAN achieves a better trade-off between accuracy and model complexity than state-of-the-art methods.
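To make the described block structure concrete, below is a minimal PyTorch-style sketch of a residual block that pairs one searched operation with one searched attention module, as the abstract describes for the RASB. The candidate operation set, the channel-attention design, and all names and parameters here are illustrative assumptions, since the abstract does not specify the actual search space or the discovered architecture.

```python
# A hypothetical sketch of a searched residual attention block: one operation
# chosen from a candidate pool (the OSB's role) followed by an optional
# attention module (the ASB's role), wrapped in a residual connection.
# Everything below is an assumed stand-in, not the paper's actual design.
import torch
import torch.nn as nn

# Assumed candidate operations the operation search might choose from.
CANDIDATE_OPS = {
    "conv3x3": lambda c: nn.Conv2d(c, c, 3, padding=1),
    "conv5x5": lambda c: nn.Conv2d(c, c, 5, padding=2),
    "sep3x3": lambda c: nn.Sequential(
        nn.Conv2d(c, c, 3, padding=1, groups=c),  # depthwise
        nn.Conv2d(c, c, 1),                       # pointwise
    ),
}

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style channel attention (an assumed candidate)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)  # rescale channels by learned weights

class RASB(nn.Module):
    """Residual block: a searched operation plus a searched attention choice."""
    def __init__(self, channels, op_name="conv3x3", use_attention=True):
        super().__init__()
        self.op = CANDIDATE_OPS[op_name](channels)  # operation picked by search
        self.act = nn.ReLU(inplace=True)
        self.attn = ChannelAttention(channels) if use_attention else nn.Identity()

    def forward(self, x):
        # Residual connection around the operation + attention pipeline.
        return x + self.attn(self.act(self.op(x)))

if __name__ == "__main__":
    block = RASB(channels=64, op_name="sep3x3")
    y = block(torch.randn(1, 64, 32, 32))
    print(y.shape)  # torch.Size([1, 64, 32, 32])
```

In a network like the described MRAN, several such found blocks would be stacked in sequence before a refinement module; the sketch above covers only a single block under the stated assumptions.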
Date of Conference: 05-09 July 2021
Date Added to IEEE Xplore: 09 June 2021