
Learning continuous piecewise non-linear activation functions for deep neural networks


Abstract:

Activation functions provide the non-linearity in deep neural networks, which is crucial for optimization and performance. In this paper, we propose a learnable continuous piecewise non-linear activation function (CPN for short), which improves the widely used ReLU in three directions: finer pieces, non-linear terms, and learnable parameterization. CPN is a continuous activation function with multiple pieces that incorporates non-linear terms in every interval. We give a general formulation of CPN and provide different implementations according to three key factors: whether the activation space is divided uniformly, whether non-linear terms are present, and whether the activation function is continuous. We demonstrate the effectiveness of our method on image classification and single-image super-resolution tasks by simply changing the activation function. For example, CPN improves top-1 accuracy over ReLU by 4.78% / 4.52% on MobileNetV2_0.25 / MobileNetV2_0.35 for ImageNet classification and achieves better PSNR on several super-resolution benchmarks. Our implementation is available at https://github.com/xc-G/CPN.
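
The exact CPN parameterization is given in the paper and the linked repository; as a rough illustration only, the following PyTorch sketch shows one way to build a continuous piecewise non-linear activation with learnable coefficients. The class name, the uniform breakpoints, and the shifted-ReLU-plus-squared-ReLU basis are assumptions made for this sketch, not the authors' formulation.

import torch
import torch.nn as nn

class CPNSketch(nn.Module):
    """Illustrative continuous piecewise non-linear activation
    (a sketch, not the paper's exact formulation).

    f(x) = x + sum_k [ a_k * relu(x - b_k) + c_k * relu(x - b_k)^2 ]

    Each shifted-ReLU basis term is continuous, so f is continuous by
    construction; the squared terms contribute a non-linear component on
    every interval to the right of the corresponding breakpoint b_k.
    """

    def __init__(self, num_pieces: int = 4,
                 x_min: float = -2.0, x_max: float = 2.0):
        super().__init__()
        # Uniform division of the activation space; a non-uniform variant
        # would make these breakpoints learnable parameters as well.
        self.register_buffer(
            "breakpoints", torch.linspace(x_min, x_max, num_pieces))
        # Learnable per-piece slope (linear) and curvature (non-linear)
        # coefficients. Zero initialization makes f(x) = x at the start.
        self.a = nn.Parameter(torch.zeros(num_pieces))
        self.c = nn.Parameter(torch.zeros(num_pieces))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # (..., 1) - (num_pieces,) broadcasts to (..., num_pieces).
        shifted = torch.relu(x.unsqueeze(-1) - self.breakpoints)
        return (x
                + (self.a * shifted).sum(-1)
                + (self.c * shifted.square()).sum(-1))

# Usage: drop-in replacement for nn.ReLU in a small block.
block = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1),
                      CPNSketch(num_pieces=4))
y = block(torch.randn(1, 16, 8, 8))

With zero-initialized coefficients the sketch starts as the identity; initializing the slopes so that f approximates ReLU would be a natural alternative starting point, in line with the paper's framing of CPN as an improvement over ReLU.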
Date of Conference: 10-14 July 2023
Date Added to IEEE Xplore: 25 August 2023
Conference Location: Brisbane, Australia
