A Dynamic Rectified Linear Activation Units


Abstract:

Deep neural network regression models produce substantial gains in big data prediction systems. Multilayer perceptron (MLP) neural networks have richer properties than single-layer feedforward neural networks, and making networks deeper, and thereby more intelligent and sophisticated, is one of the main research directions. However, the vanishing gradient is the primary problem restricting this research, and choosing an appropriate activation function is one of the effective ways to address it. A bold idea about activation functions emerged: if the activation function differs between two adjacent training epochs, the probability of producing the same gradient value is small. We propose a novel activation function whose shape changes dynamically during training. Our experimental results show that this activation function with "dynamic" characteristics effectively avoids the vanishing gradient and allows multilayer perceptron networks to be made deeper.
In the backpropagation calculation, the MSE value of the loss function is also sent to the activation function layer to update the DReLU activation function.
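
The abstract describes the mechanism only at a high level: the activation's shape changes between epochs, driven by the MSE of the loss. Below is a minimal sketch of that idea, assuming a leaky-ReLU-style negative slope that is recomputed from the current epoch's MSE. The class name DynamicReLU, the tanh-based update rule, and the toy regression loop are illustrative assumptions, not the paper's actual DReLU definition.

import numpy as np

class DynamicReLU:
    """Sketch of a dynamically reshaped ReLU (hypothetical form).

    The paper's exact parameterization is not shown on this page; we
    assume a negative-region slope that is rescaled from the current
    epoch's MSE, so the activation differs between adjacent epochs.
    """

    def __init__(self, base_slope=0.01):
        self.slope = base_slope  # negative-region slope, updated per epoch

    def forward(self, x):
        return np.where(x >= 0, x, self.slope * x)

    def backward(self, x, grad_out):
        # Gradient of the activation with respect to its input.
        return grad_out * np.where(x >= 0, 1.0, self.slope)

    def update_from_loss(self, mse):
        # Hypothetical update rule: couple the slope to the current MSE
        # so the activation's shape changes whenever the loss changes.
        self.slope = 0.01 * (1.0 + np.tanh(mse))


# Toy one-layer regression loop (illustrative only).
rng = np.random.default_rng(0)
x = rng.normal(size=(256, 1))
y = 2.0 * x + 0.1 * rng.normal(size=(256, 1))

w = rng.normal(size=(1, 1))
act = DynamicReLU()
lr = 0.05
for epoch in range(20):
    z = x @ w
    h = act.forward(z)
    err = h - y
    mse = float(np.mean(err ** 2))
    # Backpropagate through the activation, then the linear layer.
    grad_h = 2.0 * err / len(x)
    grad_z = act.backward(z, grad_h)
    w -= lr * (x.T @ grad_z)
    # The MSE is also sent to the activation layer, as the abstract describes.
    act.update_from_loss(mse)

Because the slope is a function of the loss, two adjacent epochs see different activation shapes whenever the loss changes, which is the property the abstract credits with avoiding the vanishing gradient.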
Published in: IEEE Access (Volume 7)
Page(s): 180409-180416
Date of Publication: 12 December 2019
Electronic ISSN: 2169-3536
